> Windows 10 Fall Creators Update, Microsoft disabled VBScript execution in Internet Explorer in the Internet Zone and the Restricted Sites Zone by default
I'm very surprised this wasn't done a decade ago.
At this point, it's probably safe to assume that every feature in Windows that hasn't been touched for the last decade has severe security problems. And it's not only Microsoft -- maybe their approach to backwards compatibility makes them especially vulnerable, but similar things have happened in the free software ecosystem as well. It's basically a very insidious form of bitrot.
So what's the lesson here? Aggressively remove old features, as you suggest? Rewrite everything every few years? Software has become way too complex to closely audit everything forever...
Security only works properly when applied to all layers.
It is very educational to read about the security models of Multics, ClearPath, System/360 and /370, and OS/400, among many other similar systems, their attack vectors, and what we ended up with thanks to the winner-takes-all victory of UNIX-based systems.
> This year we celebrated 30 years of Morris worm.
Wow. That was one hell of a night. It melted down our university computer labs, which were packed with desperate undergrads trying to complete and turn in their projects before the midnight deadline...
VB's evaluation order, in which the left side of an assignment is evaluated before the right side, seems like a terrible idea. In most languages including C and C++, it's specified that the RHS is evaluated first, since, of course, the RHS might have a side effect on the location the LHS refers to.
Is there some advantage to LHS-first that I can't think of?
Interesting, my intuition is the opposite: that left-to-right evaluation is clearly the better approach. I just tested a few languages and it turns out that there's no clear agreed-upon answer, but LHS first seems to be more common at least for recent languages:
"=" is syntactically just a binary operator, so I expect it to behave like other binary operators (and AFAIK all other binary operators evaluate left-to-right in almost all languages). It's special because the LHS evaluates to an assignable reference rather than to a value, but nothing stops you from evaluating the left side in full before starting to evaluate the right side. As with every binary operator, it's possible to write code such that the evaluation of one side affects the result of evaluating the other side, but of course that sort of code is really fragile anyway.
He said, "this is why I tend to use a slightly smaller subset of some of the features that C-family languages support, along with an abundance of brackets."
C does not support his first example. The standard states that the behavior is undefined.
You're probably not wrong, but I specifically remember there was something you could do to trick the compiler into allowing a modification of the left-hand side to happen - for example, by having two variable names reference the same variable.
Besides, the whole point was that "trying to be clever" usually isn't clever. It just creates confusion. I don't personally believe the majority of people need to push the C spec to its absolute boundary. A subset serves most people perfectly well.
That's stylistically ugly but not hard to predict; at some point I had it beaten into my head that ternaries sit at the very bottom of the operator precedence stack (I really hope all languages stick to that). I'd expect most people to know this.
> z = a & b > c | d
That is harder to predict. I wouldn't expect most people to know this one without looking up a precedence list.
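For the curious, a quick toy check (my own example, in Python; note that Python's precedence here actually differs from C's, which makes the point even sharper):

    a, b, c, d = 1, 2, 3, 4
    print(a & b > c | d)      # False
    print((a & b) > (c | d))  # False: Python groups it this way, because
                              # comparisons bind looser than & and |
    # In C the same expression groups as (a & (b > c)) | d, since the
    # relational > binds tighter than & and |; with these values that
    # yields 4, i.e. truthy, the opposite of Python's result.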
I've recently started working on a code base with great comments and they are worth their weight in gold. Especially class and method level description comments. And doubly especially the class level ones.
Also there is none of the code golf in there that you have just given as an example.
I think it depends on how you think about it. As mentioned elsewhere, "=" is a special statement syntax rather than an operator in Python, but if you were to think of it like an operator, it would be evaluated in three phases: (1) evaluate one side, (2) evaluate the other side, (3) perform operation. The __setitem__ method defines how the left side participates in step 3, but IMO there's a separate LHS evaluation step that's independent of the assignment itself.
You might call this code snippet analogous:
    class ShoutyNumber:
        def __init__(self, n):
            self.n = n
            print(f'Created {self.n}')

        def __add__(self, other):
            print(f'Added with {self.n}')
            return self.n + other.n

    ShoutyNumber(1) + ShoutyNumber(2)
Yields:

    Created 1
    Created 2
    Added with 1
But I understand that in some mental models and some implementations, it's cleaner to have the entire RHS value ready before starting any evaluation of the LHS. I think that avoids the need to have "assignable reference" as a special type of expression result.
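A toy illustration of that step-3 role (my own example): a dict subclass whose __setitem__ announces itself. By the time it runs, the RHS value has already been fully computed; the LHS only defines what happens in the final step.

    class LoudDict(dict):
        def __setitem__(self, key, value):
            print(f'__setitem__({key!r}, {value!r})')
            super().__setitem__(key, value)

    d = LoudDict()
    d['x'] = 1 + 2   # prints __setitem__('x', 3): the RHS arrives fully
                     # evaluated; the LHS only defines how step 3 happens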
As mentioned elsewhere, it's unspecified in C but tends to be right-to-left in practice. To be clear, my categorization of languages is just from testing it directly, I didn't look it up in the spec for any of them. But I think newer languages try to have fully-defined behavior, at least for things like this.
> including C and C++, it's specified that the RHS is evaluated first.
I did not check C++ but for C, according to ISO/IEC 9899:1999 section 6.5.16 paragraph 4 on the semantics of the assignment operator: "The order of evaluation of the operands is unspecified."
Languages with complicated syntax always end up in discussions like this. A couple of days ago you had the VisiCalc guy talking about how he needed to do a stack machine, against the expectations here.
Think of Lisp, but then Lisp requires all those parentheses.
Think of J, but then you have the strange right-to-left evaluation, plus forks/trains.
Even this simple thing is hard. The fight over = versus := got Python's dictator-for-life to resign.
In some languages like Go that support multiple assignment, evaluation of some parts of the LHS occurs first, to handle pointer indirection and index expression evaluation [0][1]. I assume this makes multi-assignment predictable, because the locations of the LHS are known before the RHS is started.
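Python's tuple assignment makes a nice contrast (a toy example of my own): the whole RHS is evaluated first, and only then are the targets assigned, strictly left to right, so an earlier target can change where a later one lands:

    a = [0, 0, 0]
    i = 0
    i, a[i] = 1, 2
    # The RHS tuple (1, 2) is built first; then the targets are assigned
    # left to right: i becomes 1, and only afterwards is a[i] resolved,
    # so the write lands at a[1], not a[0].
    print(i, a)  # 1 [0, 2, 0]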
Not closely related, but this made me think of another fascinating aspect of localisation.
In places like Australia and America, if you have a group of people and you're going around them (e.g. reading in turns or playing various card games), you'd be more likely to go clockwise, viewed from above, than counter-clockwise.
But in India, people naturally go counter-clockwise, viewed from above, rather than clockwise. I once saw someone there playing a game of hearts on a phone, and it was going counter-clockwise.
Localisation definitely entails more than just translating strings. Layout and functionality changes can be involved.
In my native Hungary we also play counter clockwise, so I looked it up[1]:
> Dealing is done either clockwise or counterclockwise. If this is omitted from the rules, then it should be assumed to be:
> * clockwise for games from North America, North and West Europe and Russia;
> * counterclockwise for South and East Europe and Asia, also for Swiss games and all Tarot games.
Oh yes you do, or you couldn't even use this site, let alone do any programming.
For better or worse, left-to-right has "won" in programming-land, and I fully agree with alangpierce that it's the natural choice for evaluation order as well. We read left-to-right and the computer should as well. That's one of the main problems I have with Python: evaluating right-to-left just feels like being contrarian.
I seem to remember my university lecturer (way back in '96) demonstrating that rightmost evaluation could lead to "variable capture", but that "leftmost-outermost" evaluation would always avoid that problem.
It's been a while, however, so I might be remembering wrong (or indeed have misunderstood).
It's crazy to me to realize that there are still people employed at Microsoft who work on VBScript. I love VBScript, lots of great memories, but I don't know anyone who uses it anymore, not even VB.Net.
A lot of business line applications are still written in VB.NET. Personally, I write all of my hobby code in VB.NET since it's more or less the same as writing C# but I find VB syntax more comfortable.
There are even claims it's the fifth most popular language out there today: https://visualstudiomagazine.com/articles/2018/12/17/tiobe-v... Not a lot of other parties agree with that, on one hand; on the other hand, a shockingly large number of developers do not work in Silicon Valley, and a lot of those developers aren't using the latest JavaScript frameworks for their day-to-day jobs.
I'm not alone! I know multiple programming languages, but for fun coding (i.e. when I'm not being paid to do it) I fall back to VB.Net, as I just don't have to think about it; it flows from my fingers. Once it's an EXE it's no different from C#, and as no one else has to support my code, who cares that it's in VB.Net. I actually don't understand the hate that VB.Net gets. I think maybe people are just biased by their memories of bad VB code from the VB6 and earlier days.
I'm also a huge fan of ASP. It's small, fast, and super simple to deploy onto any Windows server. If you are just writing a small web-based tool for end users that's easy for people to maintain, then ASP is still definitely the way to go on Windows web servers. By comparison, I've recently converted a few ASP tools to PHP running on the same spec of hardware (under Debian, not Windows), and PHP is definitely slower than ASP.
I love vb.net and used it extensively, but given the multiple writings on multiple walls, I switched to C#. While I prefer the vb syntax, I found that using two languages simultaneously is more cumbersome than using a syntax I like less for home projects. I find that muscle memory is a powerful tool.
Hey they're great languages and if it works for you, it's great. I would just be conscious of any security vulnerabilities using older software, but outside of that, if it works it works.
You mean you like the point and click feel to these tools?
I know a few folks with mediocre high school degrees who are quite comfortable with command-line tools; there's no clear divide saying rich folks -> CLI, poor folks -> GUIs :)
Visual InterDev [0] permitted folks to build Classic ASP apps using drag'n'drop. It wasn't quite as sophisticated as Visual Studio .NET, but the basics were there. I never used it myself because there was too much "magic"; I preferred to knock out my ASP in TextPad.
At the time of their development there weren't many best practices yet. Microsoft promised a version 4.0 and in the process morphed that development into ASP.NET instead.
VB.NET is still taught in the GCSE (secondary school) Computer Science course in the UK. My daughters are doing it now. Fortunately they are able to run it fine in Visual Studio for Mac.
I'm teaching them a bit of Python as well, but I'm impressed with the material they're doing at school; it's all good stuff, even if the language is a bit clunky. Apparently they will be using Java for the International Baccalaureate in a few years.
VB.NET has about as much to do with VBScript as Java has to do with JavaScript. VB.NET is largely just syntactic sugar for C# since they both utilize the CLR. While there are differences [0], about the only one that people typically find when moving to C# from VB.NET is the fact that there's no equivalent of Visual Basic's With statement [1]. The differences of opinion about C# and VB.NET are almost entirely personal preference.
Really? I didn't know that was an option. Shame, still VB.NET isn't as painful as I'd feared it might be and the important thing is the concepts. Secondary school CS is a whole world better than it was when I was their age.
Having said that, I know there are concerns that by focusing on "real" computer science the course has swung too far away from everyday practical computing skills. Personally I think schools should lean towards the academic end of the vocational-academic spectrum.
I've avoided pushing Python too hard, they have enough on their plates learning VB, but I have dipped into it with them to show how it does some things differently. I don't want to make them feel resentful about learning VB, that wouldn't be constructive, so I'm learning VB along with them. I do think it's useful to understand what things are pretty fundamental, and which things can vary meaningfully between languages though.
When I did secondary school "technology", which was the overseeing subject, we used BBC BASIC and 6502 assembly (some of us, anyway!) to drive CNC equipment and Lego, and spent half of the time with a soldering iron in hand, etching PCBs and stuff. In business studies, a wholly separate subject at the time, we learned how to use spreadsheets and word processors, write letters, etc., on RiscOS, which had just rolled out about then.
I think the education now is abysmal in comparison. Why? A weird reason. None of the technology I learned about then is relevant now. 6502 assembly is dead, BBC BASIC is dead, RiscOS is dead, all the software packages are dead, China makes my PCBs etc. The education it gave me was a mental model of computing and how to approach problems with self sufficiency.
I feel a lot of technology platforms now, including .Net (something I have been using since day one), remove all self-sufficiency from you and abstract so much away that it's harmful. I see many younger staff at companies I have worked for whose education has left them with so many gaps that self-sufficient problem solving is impossible.
I doubt it (the age thing). RiscOS came out while I was at university. I think you must have come into secondary school comp sci just as it started getting good. When I did it a few years earlier (started it - I dropped out) they were still teaching us how core memories and punched cards used to work. They did have a few BBC Micros, but hadn't started using them in teaching. Yes, I am THAT old :)
My eldest just did a test where one of the questions was to describe the fetch and execute cycle, so they do cover low level concepts in Comp Sci GCSE.
PCBs and integrated circuits are covered in GCSE Electronics. Robots and control systems I'm not sure about, but they have that stuff plus a 3D printer and laser cutter in the Design and Technology lab. I'd have been all over that in my day, but my girls prefer to spend their time in the fully equipped soundproofed music studio practicing 7 Nation Army with their rock band after school. Kids these days! And this at a public school, albeit a really good one.
Some time ago I had to work on a heavy XML-processing application and found out VB.NET had a nice XML literals feature. I completed the development using it and it was a joy.
EDIT: To anyone who didn't get it, XML literals are used like this:
    Dim myContact = <contact><%= p.Name %></contact>

    Dim allContacts = <contacts><%= From c In db.contacts Select <contact><%= c.Name %></contact> %></contacts>
It's not that I don't like XML literals, but I rarely use them. An XML serializer is almost always preferable. And they cause havoc in Razor, where the editor is always confused between XML literals and HTML snippets.
I'm pretty sure VBScript as a language, or even the VBScript engine in Windows, has not had any updates whatsoever in the last 15 years. It's still a nice language though, not to be confused with Visual Basic (like JavaScript vs Java).
VBScript is basically dynamically typed VB6. It's not a nice language - it has all the worst parts of VB6, and then adds its own. Take a look at some of Eric Lippert's old blog posts from the days when he was working on VBScript:
To give one example that is somewhat relevant here, because it describes one of the biggest misfeatures of the language (default properties), one that is used in the exploit:
I use VBScript at work. I maintain a system that was started in the early 2000s and uses Classic ASP and continues on to this day. They are rewriting it slowly but in the meantime I get the joy of keeping it running.
I too had the joy of looking after a fairly large internet-facing Classic ASP system, with code dating back to the late 90s. There were some parts of the codebase that I never got around to completely understanding. It was a horrific sprawl of deeply nested #includes, with markup and server-side script intermingled. Some of these "pages" were up to 2000 lines long, and you just wanted to rip your eyeballs out any time you needed to go near them.
You made me want to check how many lines our biggest page is. Turns out it is 12,533 lines long. The kicker is that it could be much longer, but there is a convoluted process for adding more HTML: the results from the database are turned into XML, which is transformed via an XSLT into more HTML, and then JavaScript is used to dynamically add the generated HTML.
Minor nitpick. Office embeds VBA (Visual Basic for Applications) which is pretty much Visual Basic 6. You also get the benefits of early binding and strong typing, for example:
    Function DoSomething(name As String) As String
        ' In VB the return value is set by assigning to the function name
        DoSomething = "Hello, " & name
    End Function
VBS is a different beast. Whilst it shares more or less the same language keywords and constructs as VB6/VBA (most VB6/VBA code is valid VBS code), it's actually more akin to a dynamically typed language.
For example, all variables in VBS are Variants; you can't declare things as explicit types, say strings or integers. If you need to instantiate classes residing in external COM libraries, you need to use the CreateObject() built-in and specify that class's "ProgID", e.g. the classic:

    Set fso = CreateObject("Scripting.FileSystemObject")
I thought that particular flavor was referred to as VBA (Applications). I know Office has exposed its interface through COM to VBS since at least Office 2000, but I thought internally Office macros were written in VBA.
And I know what you mean; regardless of the flavor, I've had to do some pretty crazy stuff using VB for Access, Excel and Outlook. Once you show someone something small and helpful you wrote in VB, they think they can automate the whole office with it. Sometimes it works, sometimes not, but even when it worked it was always kind of brittle, and I found myself having to code around a lot of exceptions, which was pretty darn tedious.
All in all I don't think VB.net is too terrible, but I would not be sorry to see a lot of legacy VB just quietly disappear. I feel like in 20-30 years, though, it will be the new COBOL, given the number of internal business systems that were implemented with it in the 90s and 2000s.
I'm a professional programmer working full-time on multiple Visual Basic .NET applications. Apart from SQL and the odd bit of HTML, CSS, and JavaScript, it's the only language I need to know. I know more and can work in more, but it's all that I and the hundreds of other programmers in the company need.
I used VB a bit after a bit of Linux/bash, and I was super sad that MS never marketed VB as a system customization language for Windows (even though that's not surprising [1]). It's not a great language, but at least it has some kind of typing, and with COM modules you can tap into just about anything in your OS (for better or worse, of course). Bash felt like a string-grep hell very fast to me.
And nowadays powershell seems a great replacement.
[1] Of course Windows wasn't meant for tinkerers but for users/customers; developing was for the MSDN side of things, for which you'd agree to pay because you'd sell your creations, I guess.
We still use it for a few services, like creating XML on our scanners that records who scanned a document so it can be automatically filed.
Moving data dumps to and from SFTP servers.
Emptying Outlook mailboxes and storing their content so our SSIS/SSAS services can use it.
Stuff like that. We could use other things, I suppose, and it is slowly getting replaced, but VBScript is the only language everyone knows, from the IT technicians to the developers, and that has been really valuable to us.
Many people in the life sciences domain use VB.NET when their macros cannot live inside Excel anymore, or for simple programming tasks without having to mess with IT.
So I got downvoted last time I brought this up, but if a large corporation hasn't fuzzed their products / code, doesn't this start to border on negligence?
Bugs will of course happen, but failure to fuzz products from companies that employ tens of thousands of people seems inexcusable.
Fuzzing is great and undefined behavior is terrible, so I mostly agree with you. But negligence is a high bar, so let me add at least a small counterpoint:
These big, old systems are full of crashes. Many of them have been triggered and investigated before, and determined not to be a security vulnerability, and so left as they are. If you fuzz these systems, you might find a lot more false positives than real exploitable vulnerabilities. It might be even worse than that, with the false positives being so many that they get in the way of the fuzzer actually discovering new issues.
If these were newer projects, it might make sense to really clean the codebase up and fix every crash. That would make debugging tools more useful, and it would avoid future cases where some compiler update turns your "harmless" crash into an exploit. But in these legacy projects, that sort of cleanup would be very expensive, and many of the devs with the expertise to do a proper job of it have long since moved on. It really might not make economic sense to invest that much in cleanup, even if Microsoft has fully internalized the cost of vulnerabilities to their users.
After you've tried setting up and conducting fuzzing (rather than just reading someone else's success stories), you'll see why we cannot assume they haven't. Fuzzing isn't a binary state: you can spend weeks fuzzing something, and then someone else with a different methodology can find dozens of new bugs.
Fuzzing scripting languages is particularly complicated because the languages themselves have effectively infinite state. The more sophisticated the fuzzer, the more coverage it has, but writing a bespoke language-specific fuzzer is about as complicated as creating the language itself.
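To make the methodology point concrete, here's a deliberately naive sketch (plain Python; the function and its target are my own stand-ins, with Python's parser playing the role of a script engine). Random strings almost never parse, so a dumb fuzzer like this barely reaches the interpreter's deeper states; that's exactly why grammar-aware fuzzers exist:

    import ast
    import random
    import string

    def naive_fuzz(parse, trials=10_000, max_len=64):
        """Feed random printable strings to `parse`, keeping surprises."""
        interesting = []
        for _ in range(trials):
            src = ''.join(random.choices(string.printable,
                                         k=random.randint(1, max_len)))
            try:
                parse(src)
            except SyntaxError:
                pass                  # expected rejection, not a bug
            except Exception as exc:  # anything else deserves a look
                interesting.append((src, exc))
        return interesting

    # Stand-in target: Python's own parser instead of a VBScript engine.
    hits = naive_fuzz(lambda src: ast.parse(src, mode='eval'))
    print(f'{len(hits)} inputs triggered something other than a parse error')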
It should, but it doesn't. People haven't quite figured out that when software becomes infrastructure, software engineering should look a lot more like civil engineering.
One interesting bit: if attacker-controlled data is interpreted as a VBScript variable, this can result in a lot more than just an infoleak and can easily be converted into code execution. This issue is a good example of why, in general, an out-of-bounds read can be more than an infoleak: it always depends on precisely what kind of data is being read and how it is used.
It feels like the language got too many features, and then the model got so complicated that devs could no longer correctly reason about it. I would suspect that such errors are less likely in a simple language like Lisp.