In my OS course, in 1969, I wrote an IPL (bootstrap) program that copied cards to the printer, running in supervisor mode. It ran on a 360 virtual machine under the MTS time-sharing system. Before that, I wrote programs in assembly language for an IBM 7044, and I was even allowed to operate the 7044 on some graveyard shifts. I also wrote programs that ran on a CDC 3600, a Univac 1108, and a Honeywell 200.
Not sure which Fortran this refers to. I never used Fortran I, but as I understand it, names were up to 6 characters long, first character alphabetic; names with an initial letter of A-H or O-Z were REAL, and I-N were INTEGER (Fortran II added declarations to override the defaults). Dartmouth Basic restricted names to a single letter and an optional digit.
Incidentally, the various Autocode languages of the 1950s in Britain had 1-character variable names.
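(To make that default-typing rule concrete, here is a minimal sketch in Python; default_fortran_type is a made-up illustrative helper, not anything from a real compiler.)

    # Classic Fortran implicit typing: names starting with I-N default to
    # INTEGER, everything else to REAL. Hypothetical helper for illustration.
    def default_fortran_type(name: str) -> str:
        return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

    assert default_fortran_type("INDEX") == "INTEGER"
    assert default_fortran_type("ALPHA") == "REAL"
    assert default_fortran_type("X1") == "REAL"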
That’s super interesting. It would have been some mainframe Fortran from the 1970s, because I remember him bringing me and my brother, as children, into a university computer lab where he had weaseled some time on a mainframe so he could punch cards. He told me the variable-naming thing (and he was prone to exaggeration), so it might not even be true. I can’t ask him now, as he’s busy writing pseudo-Fortran implementations of Newton-Raphson with short variable names in the great computer lab in the sky.
One very small correction: QUIKTRAN wasn't a “mathematical utility”, but an early time-sharing system, which I think ran on a 7044 (coincidentally, my first mainframe). It offered an interactive Fortran system, with editing and debugging facilities. IBM's later CALL/360 system was a successor to this, adding PL/I and Basic.
Interesting UX fact: IBM researchers looked at user satisfaction on this system. They found that it wasn't poor response time that bothered people, but variability of response times. If users couldn't predict how long an operation would take, that bothered them. So they inserted delays so that average response times were maybe longer, but variance was lower. And users were happier.
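(A minimal sketch of that idea, assuming nothing about IBM's actual mechanism; the two-second floor is an invented number. Fast operations are padded up to a fixed target, trading a little average latency for much lower variance.)

    # Pad every operation to a fixed minimum latency so response times are
    # predictable. TARGET_SECONDS is an assumed value, not IBM's.
    import time

    TARGET_SECONDS = 2.0

    def respond(handler, *args):
        start = time.monotonic()
        result = handler(*args)
        elapsed = time.monotonic() - start
        if elapsed < TARGET_SECONDS:
            time.sleep(TARGET_SECONDS - elapsed)  # delay fast replies to cut variance
        return result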
I can't remember the number of times I have given up on software because there is no “Concepts and Facilities” document that gives me a mental model of what the program will do and how I operate it. Instead, there's a website with a mass of unrelated documents that I'm supposed to read in some order, and from which I'm supposed to divine how everything works.
Maybe it's because I'm a child of the 20th century that I value documentation that gives me a path from understanding what and why to learning how I can use it, with a master index so I can look up exactly what the “Mung Until No Good” command does, and see which other commands relate to it.
Perhaps one day generative AI will be good enough that we can feed in one of these websites, with its pages titled “Migrating from V4.7.2 to V4.7.3” and “Building for OS/2 Warp” (I exaggerate, but only slightly), and get back documentation that is useful for learning, use, and troubleshooting. I live in hope.
Not just an issue for documentation, but also for marketing materials. I can't count the number of times I've looked into some app that was referenced vaguely in a comment somewhere, only to have to dig for half an hour on that thing's website just to figure out what the thing is even supposed to do in the first place...
I haven't looked at it, but it's purely functional (so no destructive operations such as set!). It can't be called Scheme, because it's a subset of RnRS Scheme; that's why Racket isn't called PLT-Scheme any more. I can imagine this as a teaching tool (though the FAQ says the error messages aren't good), or perhaps usable as an extension language.
I'm going to look at it as a scripting tool that compiles to C.
I used to work for a company whose internal communication often claimed “world domination” as its ultimate goal. I just looked at revenue estimates for its market sector; this company isn't in the top 5, and is far behind the leader. Let's just leave Owl's world domination goal as aspirational.
It's the other way around for Racket, where R5RS and R6RS are subsets :-)
IIUC Racket's new name came about from, basically, brand confusion: to avoid being (mis)understood as "yet another implementation of Scheme" rather than as the thing-in-itself it had become.
I have a smart TV running Roku. I still use cable (for news), so the other 2 sources I see are the 2 computers I have connected. The only time I ever see Roku is when I'm selecting a source. The TV is smart, but I'm smarter: no Ethernet cable.
I was 13 when DTSS was introduced, so I never had an opportunity to learn programming with Basic. Fortunately, that didn't harm me, and I've managed to compensate for this disadvantage.
I don't want in any way to minimize the impact of a language designed for non-experts. But while Basic, with its many limitations, was the best that could be done on the relatively limited systems it was first implemented on, it doesn't scale. I recall, around 1970, building an interactive front end for an inventory system using a commercial company's version of Dartmouth (or GE) Basic. It came to about 900 lines, and even I couldn't make sense of it.
It's a mistake to believe that non-experts write only 20-line mortgage programs or 50-line dice games. If what you're teaching them has any value, they will naturally want to write programs that grow organically as they understand the problem better. Dartmouth Basic is a language in amber, best understood as what could be done given the equipment of the 1960s and the understanding of program development at the time. It was neither better nor worse than other interactive languages of the era, for example JOSS (which begat PIL, DEC's FOCAL, and, closer to our time, even the horrific MUMPS).
I think that the true value of Kemeny and Kurtz's contribution was encouraging programming as a thing for “ordinary” people, rather than a priesthood. The language they invented was developed prior to clear understandings of structured, object-oriented, and functional programming, all of which have something to say even to non-experts. (And, yes, Microsoft continued to produce products with “Basic” in their names, but they have little to do with anything that was developed at Dartmouth.)
So, kudos to all the folks who learned their programming with Dartmouth-style Basic. But I think there are a lot of modern tools that not only help non-experts write short programs but also scale well as their knowledge and skill grow. Smalltalk was one system that demonstrated that; in more recent memory, Python and Racket are also good examples.
By comparison with film, Georges Méliès did some amazing work in 1900, but nobody would confuse that with the work of modern directors.
(I don't want to get into a discussion of What Is The One True Introductory Language; I have my opinions on that, but they are not relevant here. Instead, I am trying to put the very significant contribution of Kemeny and Kurtz—democratizing computing—into what I see as a better perspective.)
I mean, I don't disagree, but you might be surprised at the scale of systems that were written in BASIC, well into the 1990s and probably beyond that. And not the modern Microsoft Visual incarnation of it but the old, line-numbered, GOTO/GOSUB and everything-is-global classic style.
For a while I worked at a financial company whose internal systems were all in BASIC. They had dozens if not hundreds of internal users, all on dumb terminals connected to a couple of servers that ran those BASIC programs. This covered online transactional systems as well as nightly batch jobs. The programmers were mostly not computer science people but ordinary, smart people who understood the business and did a good job with the tools they had. It wasn't all a mountain of spaghetti; they had put a lot of thought into their standards, practices, and documentation, and it was pretty easy to work on.
It was used for far more than short programs and teaching.
I am not surprised that large-scale programs were written in Dartmouth Basic, or that those programs satisfied the needs of their users. Some program libraries include programs in unstructured Fortran IV, which have been running satisfactorily for 60 years. Similarly, some very complex hospital systems were written in Mumps (aka M), in which, due to both the language and the programming style used, programs looked more or less like line noise.
I once consulted for a company that had a product written in Pick Basic. This product had been sold around the world, and was very successful in their market. They wanted to modernize the product, so they went to a big DBMS vendor with a target business problem; the vendor said it would take several months to produce a sample solution. The company gave me the problem description. The next day I went back with a PoC program, 50 or so lines of C++. I emphasized that I had used C++ just because it was convenient (and that really any standard language would do), explained what parts of the problem were not addressed, and estimated that the entire program would take about a week's work to do. The client agreed on this, but said (a) that they needed a solution that was compatible with Pick Basic, and (b) their programmers only knew Pick Basic and wouldn't be any good at learning anything else. I don't know what became of the client and their product. (The Pick system was a combination of a variant of Dartmouth Basic and a DBMS.)
I'm not in any way saying that Dartmouth Basic is useless. I am saying that creating the language was NOT Kemeny and Kurtz's real flash of genius; making computing accessible to non-experts was the actual point.
OK, since we're wasting our time on the Pointless Editor War, I shall confess I know exactly one thing about Vim, the command `:q!'. That's all I've ever needed to know.
Many years ago, I was a user of an IBM 1130 computer system. It filled a small room, with (as I recall) 16K 16-bit words and a 5MB disk drive, which was quite noisy. I would feed in my Fortran program, then head down to the other end of the building to get coffee. The computer would think about the program for a while, and then start writing the object file. This was noisy enough that I'd hear it, and head back to the computer room just in time to see the printer burst into action.
(Please feel free to reference the Monty Python “Four Yorkshiremen” sketch. But this really happened.)
But I liked the PDP-8 best.
So there!