Late 70s and 80s: forget BASIC, we had Pascal and C (retrofun.pl)
85 points by ikari_pl on Jan 4, 2024 | hide | past | favorite | 82 comments


Yes, we had Pascal and C in the 80s. But what kid growing up in the 80s would go out of their way to purchase a Pascal compiler and learn an academically oriented language when their home computer comes with a very accessible built in programming language, providing direct access to sound and graphics and all the things they need to program their very own game? A language most of their friends also had access to, and for which the computer magazines of the time provided program listings with detailed explanations. The rare kid who wanted to wring out more performance than BASIC could provide was happy to explore the depths of programming in assembly. I know I had my share of fun making a connect-4 player in Z80 assembly that used the color attribute memory on the ZX Spectrum to search all variations. Learning Pascal a few years later in my CS studies was early enough for me.


> But what kid growing up in the 80s would go out of their way to purchase a Pascal compiler and learn an academically oriented language

Me, and most of my friends: we pirated Turbo Pascal and had a blast with it. This was the early 90s, mind you. Rapid Windows GUI prototyping, inline assembly, and compiling to .EXE: way better than the QBasic/GW-BASIC of before. And it took years for Visual Basic to catch up (if it ever did).


The 90s is significantly later though. In the mid-1980s I had (access to) a computer with 4096 bytes of RAM. Few enough that you could have a printed chart of what they're all for, because that's just a 64 x 64 grid.

Because the BASIC interpreter lives in ROM, that's "free". Some of the 4096 bytes are housekeeping data for the system, but most of them can be used by your program and your data. Pascal, on the other hand, doesn't live in ROM, so if we're going to run Pascal software the entire Pascal runtime must fit in RAM before our program and data.

By the late 1980s I had a computer with 128kB of RAM, and sure that might have been a good place to try Pascal, if I'd been aware of Pascal, but I was not.


Yes, to anyone who was there the difference between computing in 1980 vs 1989 is night and day. Personal computing changed completely, and you really wanted either assembly or BASIC (OK, Forth if you're kinky) in the early 80s with 4K of RAM... because that's what other people had. Turbo Pascal was released in late '83, but I didn't see anyone using it until '86.

I feel like Pascal was a nice gateway into C from Fortran/BASIC, but it arrived so late that I'm not sure it was really an efficient path for learning.


In 1986 and early 1987, PCs did not have more than 640 KB of RAM, just about all the programming tools for IBM PCs were limited to a 64 KB program size, and there were all kinds of really mickey-mouse ways to try to get around those limits. There were 80286 computers that could use 16 MB without too much trouble, but in 1987 programming tools that could use the full 640 KB of the 8086 and link separately compiled code showed up (e.g. Turbo Pascal 4.0), the same year the 32-bit 80386 desktop PCs were launched. Borland's Pascal and Delphi pretty much matched the capabilities of Microsoft's C and C++ through those first disorienting years of change.


I definitely had Turbo Pascal in the 80's as a 16/17 year old. Started with version 2.0, in 1985. I still have a Turbo Pascal 3.0 manual and a set of 5.5 manuals (the latter of which included a nice little booklet intro to OOP.) Just checked, the copyright page of the 3.0 manual is copyright 1983, 1984, 1985. The 5.5 manual is 1987,1988. I managed to get a summer job programming in Turbo Pascal in 1987.


In the 1980s I used (pirated) Hisoft Pascal on a ZX Spectrum clone (48 KB of memory). It wasn't as nice as Turbo Pascal but it was great for computations: by 8-bit standards it was very fast. I learned about Pascal from a computer magazine article that disparaged BASIC and encouraged "structured" programming.


>> But what kid growing up in the 80s would go out of their way to purchase a Pascal compiler

On the IBM PC, Turbo Pascal was cheap and hugely popular in the late 80's and early 90's for hobbyists. Mind-blowingly faster than BASIC and you could mix in assembly. Lots of demos and games (and BBS doors) were written in Pascal.

http://archive.gamedev.net/archive/reference/listed82.html?c...


In 1983 Turbo Pascal sold for about $50.00, meaning that some 40 years later that price would be equivalent to roughly $200.00, given an inflation figure of about 3.5% per annum. By comparison, a very expensive IBM PC shipped with a free version of Microsoft BASIC burnt into the ROM.


True, but by comparison Microsoft BASIC was, for lack of a better word, useless for anything serious.

Plus, don't forget hobbyists would 'borrow' software from each other, and you'd get a free (possibly older) version of a compiler with a book...

I think I paid $69 for Turbo Pascal (spent all my birthday money in 1992); before that I had an old version of Microsoft Quick Pascal that was only $14.95 from Surplus Software, as well as Turbo C 1 that came with a book from SAMS Publishing.


Not contradicting you, just adding the context that there were BASIC compilers that could speed things up significantly. I'm not sure of the cost or the relative performance, but they were fast enough to build commercial products with.


This is a very good description of precisely what happened. People forget that Pascal and C compilers were not only unusual/academic, they were also very expensive: $300-$500 or more. C was not ubiquitous then, Pascal was more common, and (like C) there were several incompatible variants competing for market share. For a young person who had (perhaps only barely) been able to afford a Commodore, Atari 400/800, TRS-80 or the more expensive Apple ][, an additional huge expense for a compiler was very likely out of the question.

I was in a computer club and I remember a big bearded guy from IBM (one of the club founders) introducing a new programming language called "C", and thinking "why bother, when you can just code in assembly?"

My experience was with the Apple ][.

Instead of Pascal or C, almost everyone went from Applesoft BASIC straight to assembly, with LISA being the tool of choice. (Lazer's Interactive Symbolic Assembler - https://en.wikipedia.org/wiki/Lazer%27s_Interactive_Symbolic... written by Randy Hyde, a noted Apple 2 expert who also wrote this guide: http://www.appleoldies.ca/anix/Using-6502-Assembly-Language-...)

We had 1 MHz. One "core". 48 kilobytes of addressable RAM (actually less, once the buffers for video were subtracted). That's an extremely limited space.

Of course the workflow was awful. Boot up. Open your editor. Edit your source code. Save to disk. Quit your editor. Run the assembler. No errors? Quit and exit to DOS. Reboot and run your new code. (If the program was large, boot to the DOS tools disc and copy the code from one floppy to another, then boot that new floppy.) Lather, rinse, repeat.

Applesoft BASIC and assembly were enough for me, until I took my first professional programming job in the 80s and learned Pascal to code for the Lisa and then the Macintosh. I learned C in the late 80s for Windows 2.1 and 3.0, and then C++ (beginning with the Microsoft C++ beta compiler, distributed on something like 20 5.25" floppy disks). I moved to C# as soon as Microsoft introduced .NET, and to other languages in the decades since, but I never had as much pure joy as the time I spent on my Apple ][. I still have it, and remarkably it still runs 100%, and most of my floppy discs (Memorex! Gorilla!) still work, even though I used a "nibbler" tool and used both sides. At $5 each (!!) for 80K of storage we had to stretch the budget.


Many people's (including mine) first experience with Pascal was with Apple Pascal on the Apple II. This was a port of the UCSD Pascal system by Apple itself and was basically its own operating system with its own editor and even disk format. It compiled to "P-code" which meant in theory you could run your compiled software on the various big machines UCSD was developed for (not that I ever tried that). Yes, it was pricey if purchased legally, but as with almost all Apple ][ software, that wasn't how people generally obtained software.

https://en.wikipedia.org/wiki/Apple_Pascal


I also finally got to try out UCSD Pascal, for the Atari 400/800. It was an awful experience, using multiple floppies and having too many phases or whatever to edit, compile, and run a program. That cured me of any programming-language FOMO for decades. There was a cool Action![0] language cartridge for the Atari 8-bit which was awesome: it was lower level than C and felt like writing high-level assembly.

Using MS C (version 3 and onward) on PC and Megamax C on Atari ST were great though. I didn't run into C++ until the 90s on Windows, NT and OS/2.

[0] https://en.wikipedia.org/wiki/Action!_(programming_language)


> Yes, we had Pascal and C in the 80s. But what kid growing up in the 80s would go out of their way to purchase a Pascal compiler and learn an academically oriented language when their home computer comes with a very accessible built in programming language, providing direct access to sound and graphics and all the things they need to program their very own game?

Me... I saved up and bought a Pascal compiler for my C=64.


You would buy, or rent from the local library, a book for the ZX Spectrum and type the Pascal compiler source code in.

https://worldofspectrum.net/pub/sinclair/books-pics/p/Pascal...


That never happened. At least, not in my circles. The progression was typically BASIC of some variety (whatever your system shipped with), PEEK & POKE, machine code, assembler, and later on either C or Turbo Pascal, assuming enough time had passed.

Pascal on 8 bit was just too damn slow compared to the alternatives to be very useful and compile times were horrendous. Even the university students that had access to Pascal compilers for the most part only used them to turn in their assignments and avoided it for everything else.

I did a little contest with a friend: we both wrote a chess program. He wrote his, state of the art with a ton of optimizations, in Pascal; I wrote mine in assembler. His program never stood a chance against a very simplistic implementation that only had minimax and a greedy (strictly material-based) evaluation routine.

Good times :)


I had a TRS-80 Color Computer that had BASIC built in, and the first alternative language I got was an "editor/assembler" ROM cartridge.

By the mid 1980s they came out with an operating system called OS-9 which was a multitasking OS inspired by UNIX.

https://en.wikipedia.org/wiki/OS-9

You could get a few languages for that, including a C compiler. A lot of people were using C on CP/M, so I would type in programs from Byte magazine. I wrote a FORTH (in assembly language) for OS-9 that was subroutine-threaded and, unusually, had a C-like API to access files instead of the block-based I/O most FORTHs had.

By the late 1980s I had a 80286 machine and wrote a lot of Assembly, C and Turbo Pascal. I thought Turbo Pascal was better than C but switched to C when I got to college because it was portable to the Sun workstations they had. By the time I got to grad school I got a 486 and switched to Linux.

Personally I liked 8086 assembly a lot and even though the segmentation scheme it used could be awkward I thought it was fun to code.


I chose C (Turbo C) because it was much better at strings than Pascal: specifically, it had strdup() and string length was not limited to 255 characters. Also C had escape sequences - actually Turbo Pascal had them too, though I'm not sure about standard Pascal.

I think earlier versions of Pascal had only fixed-length, space-padded strings.

Yeah, 6809 and 8088 (arguably an 8-bit CPU) supported C just fine. The other 8-bits, not so much. I wish the CoCo had been a better computer than it was (crappy keyboard, not a real serial port, terrible video). My Atari 800XL was better except for its CPU.


Strings with maximum length are a bummer but null terminated strings have their own problems and are part of the whole "unsafe at any speed" nature of C.

The thing about the CoCo was that Radio Shack sold an absurd number of peripherals for it.

I got a "Multi-Pak" which would let me plug four devices into the cartridge slot, and I had a real UART for it. The built-in sound was awful, but you could get an Orchestra-90, a speech synthesizer cartridge, etc. Most of the hardware and software at the time supported hardware flow control, which made the bit-banger serial port more reliable than you might think at low speeds.

The video was a real problem; I think a lot of software didn't get ported to the CoCo because everybody thought 40-column text was bad enough, so who wants to deal with 32? For instance, I never saw the Infocom games available for the CoCo, although I found out just recently that Infocom had made a Z-Machine interpreter for it. Eventually they came out with the CoCo 3, which had 80-column text and high-resolution modes with a lot of colors, but it was getting late at that point.


We (my friend and myself, who had identical machines) managed to squeeze 64x32 text out of the 'high res' monochrome graphics mode (PMODE 4), and that really made life easier. At 256x192 the font (4x6!) wasn't exactly spectacular, but it worked well enough for real use.


Fixed-length strings were a thing in ISO Pascal; most dialects had better string support, and even ISO Extended Pascal ended up improving them, although by then no one cared.


There was a UK clone of the CoCo that had a half decent keyboard, the Dragon 32 (actually: it had 64K if you tickled it just so).


I'm currently running CP/M, on a Z80 single-board computer with 64k RAM.

Compiling a simple snake or Tetris game takes about two seconds. CP/M with Turbo Pascal is pretty fast.

I started my journey with ZX Spectrum BASIC, then z80 assembly, before jumping to MS-DOS and x86 assembly. Learning C, and other languages, didn't start until my 16-bit days.

(Though I have a memory of receiving a FORTH system for the ZX Spectrum I don't think I did anything useful with it, or fully understood it given that I was about 15 at the time.)


Very nice, what are you running, one of Ciarcia's boards or something else entirely? And what do you use for mass storage?

When I was in 8 bit territory the best I could get was 360K floppies and running any kind of serious compiler required multiple floppy swaps to get the passes in. 6809 with 64K RAM was the best 8 bit system that I had, the next best was a BBC Model B with a very much maxed out Solidisk (more than maxed out, we ran more banks than were officially supported).


A random board I came across by accident - low production run, but decent despite that.

The board uses a USB stick for storage, formatted with a FAT filesystem, which makes exchanging files between this and another system easy - though of course you can also transfer files over the serial port.

The biggest issue is that the board has only 62K of RAM, so I'm stuck with CP/M 2.2 rather than 3.0, which requires paging support. Still, I can run Infocom games, BASIC, Turbo Pascal, and other retro things.


I'd love to know more about your board, it sounds like a very interesting one. Is it in S100 bus format or something bespoke?


Something bespoke.

My code repository contains a link to a YouTube channel where the board was discussed, which is where I found it, randomly. Sadly the upstream site of the provider, and the (useful) forums it hosted, are gone unless you use the Wayback Machine:

https://github.com/skx/z80-playground-cpm-fat


As an alternative - Agon Light 2 (eZ80 CPU). By default it comes with BBC BASIC modified to take advantage of 24 bits addressing, but it can run CP/M if you want. The nice thing about Agon is that it can run standalone if you add to it a keyboard and a VGA display.


Writing complicated software in assembly is a pain. The famous "Wizardry" (one of the earliest CRPGs, released in 1981 for the Apple ][ and eventually ported to other systems) was written in Pascal. Writing all the code to support the various character classes, weapons, monsters, etc. was far easier in a reasonable structured language than in either BASIC or assembly.


Hm. I actually found assembly to be quite manageable. It required a lot of discipline, but then again, that goes for most programming languages. The bigger problem is that the use of local variables and parameters in assembly isn't exactly a 'free abstraction', but once you have a mechanism for that (relatively easy to set up using macros) it all becomes much more manageable (but then you're 20% of the way to writing a C compiler...).


I remember more than once disassembling pieces of C code and loudly complaining about the poor quality of it while hand optimising it by deleting about half the instructions even after I got my Amiga. On the C64 the notion of C or Pascal didn't cross my mind - a few BASIC compilers were somewhat moderately popular.

E.g. Oasis Software had White Lightning FORTH and Basic Lightning[1] and a few others. I never tried the FORTH version, but Basic Lightning and its successors Laser Basic and the Laser Basic Compiler started with a bunch of BASIC extensions and eventually added compilation on top of that. The extensions did things like install raster interrupts to virtually increase the number of sprites and provide sound routines, as well as add pre-emptive multitasking(!) on the C64. If I remember correctly, 3-5 BASIC tasks plus sprite animations could run concurrently - I don't recall seeing much point, because around then I started with assembler, and cycle-counting raster interrupts was suddenly where it was at.

I vividly remember the demo programs the screenshots on the page linked below are from, though...

Who'd want Pascal when you had stuff like that, though? My introduction to Pascal came on my dad's boring and slow ITT XT PC, mostly out of curiosity as to how anyone could use a machine that was so obviously inferior to my Amiga - monochrome, no bitmap graphics, no proper sound, less memory (I had a 512KB expansion, for a full megabyte)... Of course it had a harddrive, but it also cost 3 times as much... But I did end up experimenting with Turbo Pascal and some dBase for a while.

[1] https://www.jimbrooks.org/archive/programming/forth/WhiteLig...


When I bought Turbo Pascal in 1987, the retail price was $59.95 (USD), IIRC. A little earlier it was less, $39.95 IIRC. Pascal was an academically-oriented language, which would only let you use variables correctly according to their declared type, but Borland Pascal included a very useful and sensible set of loopholes for getting around all that in the many cases when that was necessary to make a program useful in the real world. It was a non-academic implementation of an academic language.


My first programming language was Commodore BASIC. It was already there when you booted up the machine, loading another compiler never crossed my mind.


You are spot on; I was there. The Ataris and ostensibly the Timex/Sinclairs were doing graphics from the very beginning with their built-in BASICs, with people typing in programs from the popular magazines. The Atari MAC/65 assembler came in 1982; the cartridge in 1984. Manx C in 1986 for the Amiga. My school taught BASIC from "Micro-Soft" on an Altair. I went directly afterwards to doing bit-level ops on Burroughs mainframes. And I still have all my Lattice C software for my Amiga circa 1990.


By the way, the original Atari models, the 400 and 800, did not have BASIC built in. (They may have been bundled with the BASIC cart, I don’t know.) Subsequent models had it built in.


> what kid growing up in the 80s would go out of their way to purchase a Pascal compiler and learn an academically oriented language

I learned Pascal on my Atari ST as an 18-year-old. I had already done commercial games for years in assembly for the ZX Spectrum and had learned m68k, but writing GUI programs for the ST in assembly seemed very annoying. Purchasing the compiler, though... was not really what happened.


C didn't work well on 8-bit CPUs (although it did exist). The Z80 was infamously difficult to target with C code. See this Retrocomputing Stack Exchange question: https://retrocomputing.stackexchange.com/questions/6095/why-...


I've always thought the problem is that C today doesn't target 8-bit machines well. The optimizations that most sources mention mostly did not exist in the first versions of C.


No, it's most high-level languages in general. Pascal, or Standard ML for that matter, would suffer just as C does.

Languages like C and Pascal really demand that you have a nice, cheap stack. Classically, a function call means pushing all the parameters onto the stack, then the return address, and then jumping to the function; the function pulls the parameters off the stack, pushes the return value, adjusts the stack pointer to drop the parameters, and reads the return address off the stack.

Neither the Z80 nor the 6502 has instructions that provide an efficient stack that works with multi-byte data. You end up constantly manipulating the stack pointer (usually for a software-implemented stack) with slow 8-bits-at-a-time arithmetic. Painful.

The PDP-11's instruction set, in comparison, provides not just one but up to seven flexible 16-bit stacks.


That's some great insight! Mind if I quote it in the post itself?


Thanks, and sure, quote away.


The classic C compilers were known for producing horribly unoptimized code.

It took many years of compiler progress to change C into the performant language we all know and love/hate today.


No, C on 8 bits was terrible. You could get it to work but the compilers were slow and the code was bloated compared to assembly.


Even in the '90s, Diablo was a mix of C and assembly for the bits that C was too slow for!

I've heard it said that, if they had the C compilers of today back then, the C code might've been good enough -- hardware being constant -- for them to have totally ditched the assembly, but I don't know how true that is.


Yes, but those C compilers of today run on the hardware of today! You wouldn't have been able to compile those C compilers to the point of efficiency required to build them in the first place. Lots of technology is like that. In order to do VLSI you need to do LSI, in order to do LSI you need MSI, and so on all the way back to thin-film circuits and then individual components. You can't normally skip such steps; it's all incremental improvement on what's already there. In software it isn't much different, and that software rode in lock-step with the hardware. The best I could do on a relatively fast micro of the 80s in terms of control, expression and eventually execution speed was a very nice macro assembler, and given the speed of the storage devices and the raw clock speed of the CPU that was an accomplishment in itself. It pretty much maxed out the hardware. In spite of using clever stuff (hash tables for symbol lookups, for instance, which ate up some of that precious memory; RAM disks; etc.) it still took a good bit of time to assemble a 16K ROM image.

But then I got an Atari ST with a whopping 1M of RAM, a 10 MB hard drive and Mark Williams excellent C compiler and it was off to the races. That was a real game changer for me.


And many/most of them were subsets of the C you'd get on, say, a PDP-11 at the time (although I understand they got better over time). That was what always put me off using C on the 8-bitters. Turbo Pascal, on the other hand...


Phh... Wealthy PC owners had Pascal and C. In the early 80s, us plebs had 8-bits that ran BASIC or we programmed assembly. Hell, my parents wouldn't even pay for an assembler for my VIC-20 so I programmed 6502 machine code by hand with POKE and DATA statements with the explicit opcodes.

Sure, if your dad was an accountant or something, and let you use his PC, you might have had access to Turbo Pascal or something, I dunno. But that was a very expensive machine, and market penetration of 16-bit machines was extremely low until the mid-80s.

Late 80s, different story.


Heh, how the other half lives!… My memory of programming in the eighties was a pad of ruled paper and a rotation of Usborne books from the library.


There is a lot of unwarranted hate for BASIC. BASIC has of course also evolved since then, and you can still write professional software in numerous BASIC dialects, just as in Pascal.

A few years ago I evaluated FreePascal for writing desktop applications; I ended up instead with the BASIC dialect PureBasic (albeit commercial), which had a better out-of-the-box experience. Pascal is also slightly more verbose than BASIC.


You could absolutely write professional software in BASIC.

As a kid I wrote my first "professional" product in Applesoft. My family used to rent videotapes (Betamax!) from a little video shop on the corner. They kept records of who rented what on paper cards -- they literally wrote in the customer name and phone # on a paper card and put it in a box. They would go through that box each day and see what was overdue, calling the customer to remind them to return. I was a curious kid so I asked the owner "how do you keep track of everything? Do you really go through all those cards every day to see what is due? Does anything get lost?" I told the owner "I can put all this on a computer and make it so you'll never lose track of who has what" and he said "tell you what, you create that and I'll buy it!"

I wrote a little program in Applesoft BASIC, using a light pen, that could keep track of the customers and video titles -- what was available, what was out, what was due, and what was overdue. The owner LOVED it! Customers had a "membership" card with a bar code on it. The clerk just had to scan the member card, a barcode for "checkout" (taped to the side of the screen), and then the bar code on the tape. No keyboard. A happy beep meant success. Check-in was the same process. And once a day they could check the "overdue" screen to see who to contact. The customer name, phone number, and name of the movie were right there.

They weren't crazy about entering the names of all the tapes and sticking the bar code stickers on them, but once that was complete, business was so much easier and faster.

It took me the better part of the summer to write this, and I was paid $200 (a fortune for a 12-year-old kid!) and given "free movie rentals for life." The owner was true to his word, and even after he bought a movie rental franchise many years later (maybe Blockbuster?) my family still enjoyed free rentals, though the Apple ][ was long since retired.


For the same reasons that a lot of people associate Pascal with either Borland's older Pascal implementations or a limited, academic Pascal, a lot of people today think of BASIC in terms of the BASIC that they had access to, and the machine that it ran on, which was a hugely fragmented experience - every magazine printing type-in code had its own listing for each platform.

And an older BASIC like BBC BASIC, which was one of the best of its era, still lacks some major conveniences associated with structured programs. It only gained record type support after Richard Russell ported it to modern platforms. The modern dialects are great, but diverge quite a bit from the early ones in that they have become fully structured with OOP support.

What's ultimately driven people over to the current family of languages is that they are making programs that talk mostly to other software, not to I/O. When you add that constraint, you may end up using the language the other software uses for pragmatic reasons. The irony of that is that it really is totally arbitrary.


Anyone who thinks BASIC is useless or a kids game has never seen what can be done with something like HP Rocky Mountain BASIC.


We also had Smalltalk, Prolog (esp. Visual Prolog) or Lisp.


No thanks! The compilers were extremely slow. If you wanted to hang around all day waiting for your stuff to compile then that was fine but it wasn't a solution for most of us. Plus the majority of the market was 8-bit machines.

In the early 80s I wrote a lot of code in (BBC) BASIC and assembly. That was extremely fast to build and run, with optimisations where necessary. For example, to use a CP/M machine at the time with Turbo Pascal, it'd take at least 10 minutes to get the machine up and compile something simple. On the BBC Micro, my preferred machine at the time, there would be working code in under a minute. That was a big gain. They were used for test automation at the time over HP-IB (IEEE-488), eventually replaced with PCs running 16-bit Visual Basic on Windows.


>The compilers were extremely slow.

I'm sorry you had such a slow machine to put up with. You didn't get to experience the joy that was Turbo Pascal on an 8 MHz 286 with 0 wait states. You'd get almost instant compile/run cycles. It was great!


... That's not early 80s though :-)


1984 ish


I do not remember anyone on the Mac using Turbo Pascal, as there was a far superior offering named Lightspeed Pascal by Think Technologies. As I remember it, it would take Borland the better part of a decade to catch up to the level of Pascal IDE quality Think offered in 1986.


THINK Pascal and the THINK Reference were fantastic: instant compilation times, live debugging with one of the best symbolic debuggers I've ever used (complete with live statement and expression evaluation), and concise, organized documentation of the Toolbox at your fingertips, complete with examples. The equal of Turbo Pascal on the Macintosh.

The Macintosh Toolbox had, early on, a reputation for being a little bewildering with a lot of new system libraries to handle (memory management, resource management, UI/TextEdit, DIY event dispatching... it seems strange to explain to people how we would have to manually detect menu events based on location, and then manually respond to menu events), but THINK Pascal gave you a lot of courage.


Borland never really cared to support Turbo Pascal for the Mac, from what I've read. Most of TP's popularity came from the PC market, after all.


Something I never think about: "It wasn't always that easy to develop a program and run it on the same machine"

Really? Of course I'm not talking about embedded systems, but PCs in the 70s/80s, like the Apple II, ZX80, TRS-80, etc.

I guess we 90s kids are too spoiled :)


I've been reading the ZX Spectrum Visual Compendium from Bitmap Books (https://www.bitmapbooks.com/products/sinclair-zx-spectrum-a-...) and it has a lot of developer anecdotes in there. One of them (Clive Townsend) was talking about their development process:

"The original Saboteur! code was written on an actual Spectrum so everything - the source code, assembler program, and object code - had to be in memory at the same time. I'd work on a routine, compile it, then save it. To test it, I'd reset the Spectrum and load the newly compiled routine along with other necessary code, graphics and data. After testing, I'd reset the Spectrum again and re-load my source code and the assembler program. All from cassette! Nightmare!"


Sometimes you could have something plugged into the extension port to snapshot the memory, so you could switch more easily AND not lose your changes. It sounds horrible NOT to have that when writing a game like that (I love Saboteur! Spent more time playing the 2nd part :))


I learned BASIC and Pascal, and bounced off of pointers in C (went to Perl instead). I probably wouldn't have learned Pascal without my first experience programming on a PC that booted into BASIC.


The big benefit of interpreted BASICs (such as BASIC-PLUS on DEC, MS-BASIC on micros) at the time was speeding up the development cycle by keeping source code in RAM and running directly from it, thus skipping the edit/load cycle across crushingly slow I/O.

Turbo Pascal's breakthrough on CP/M was bringing a similar experience to a compiled language. So you got fast turnarounds with better-performing code. Turbo did this by compiling from RAM, to RAM, and skipping the link process. The Turbo runtime was already loaded in RAM and used by Turbo itself, so the binary just called already RAM-resident routines. When the final COM file was saved, it simply bulk-copied the runtime to disk and attached the compiled code at the end. This is one reason why even small programs created large COM files with Turbo: they all bundled the entire runtime.

There's nothing particularly slow about a C compiler, but once you wrap it in reading from disk and writing to disk, or, even worse, writing assembly that then has to be assembled, and then stack on a link step, turnaround craters. The practice of linking several C files together involved yet more I/O. Ostensibly there was a benefit in not having to recompile all of the files all of the time (but note that on CP/M at the time there were no timestamps on files, so "make" wasn't a real option for this), hoping for a net win. Linking also let your program include only the routines it truly needed, reducing footprint. Linkers have lots of benefits; performance during the dev cycle is simply not one of them.

As you started to write larger Turbo programs, utilizing include files and having to write to disk (because the RAM wasn't enough for your code any longer), Turbo most certainly slowed down. But still not quite as bad as something like C because, again, there was no real linking phase. I have, I think, a 1200-line Pascal program, in several include files, that builds on a 4 MHz (simulated) Z80 and, yeah, you can sit there watching the lines tick by. It doesn't take minutes, but it's definitely time to get a sip of coffee and catch your breath looking out the window.

The Atari series had ACTION!, which was a very Turbo-like system using a simple Algol-esque language. It, too, had a bundled runtime on the ROM cartridge, with a very fast compiler. I tried a C compiler on the Atari and tossed it after my first build. It was absolutely horrific. The Atari (along with the Commodores) had notoriously slow disk drives.


On the Atari ST (and the Amiga I am told) there were several Modula compilers available. You can download the source of the Megamax Modula-2 compiler from Thomas Tempelmann's homepage:

http://www.tempel.org/files-d.html


Logitech (yes, the mice/keyboard company) also had a Modula-2 compiler/IDE for the PC in the mid-1980s. Unsurprisingly for Logitech, it supported the mouse, even though mouse support on DOS wasn't that common yet.

http://www.edm2.com/index.php/Logitech_Modula-2


I had Personal Pascal and Megamax Modula-2 on my ST. Great software. But most people used GFA BASIC, which was a structured, compiled BASIC with a Pascal sheen to it.

Or, if they were serious, they used C. The API for the AES and VDI was designed with C in mind, because it was written in C. Void pointers all over, stuff that Wirth languages despised.

The GEM/TOS bindings for other languages were terrible. I never got Modula-2 to work well for GEM development; it simply didn't ship with bindings for it. I didn't have a C compiler, so for anything GEM-ish... it was GFA.

But TFA is about the early 80s, and is out to lunch. Only professionals with expensive PCs had access to C and Pascal in the early 80s. Though I guess there were Pascal dialects in the Apple II world, they weren't that commonly used.


A dive into the history of higher-level programming languages in the 80s, with most focus on Pascal, some on C, and an honorable mention of Forth. Continuing the spirit of "how come things we have today are the way they are", the article aims to answer why Pascal was so influential, what made it so popular, when we got C, and how it was different.

It's also a follow-up to an article appreciated on Hacker News that focused on how the original BASIC compared to its competition (and why The Famous Quote isn't adequate to the BASIC most of us may know) - https://news.ycombinator.com/item?id=38743062


Forth was a very natural transition if you came into programming from the RPN-style HP-41CV calculator, an iconic machine for its time.


This looks like a fun historical view of programming in the 70s and 80s.

The section about C, though, I'm sorry to say, is littered with flaws and should be rewritten and/or proofread by someone with more knowledge about the language. Not impressed.


Indeed. I noticed "Going further, as was a common convention in assembly to pass the return value in the A (for 8-bit, later AX for 16-bit, and EAX for 32-bit) register of the processor (because it is faster than by a value in memory),". That is not the reason. While on many architectures, except perhaps 6502, it was faster (before we had first level data caches as fast as registers), the reason was reentrancy.

And the Z80A in an Amstrad CPC6128 was clocked at 4 MHz, not 3.5 MHz like a Speccy. (Sorry, couldn't resist.)


> And the Z80A in an Amstrad CPC6128 was clocked at 4 MHz, not 3.5 MHz...

It's quite common to be mistaken about this, though, because the Z80 in the CPC was not running at full 4 MHz speed (even though it's clocked at 4 MHz). The gate array would assert the Z80's WAIT pin for 3 out of 4 clock cycles to create a window for the gate array to read pixel data from memory. This forces the Z80 into a 4-cycle-aligned pattern, meaning that memory-access machine cycles were stretched from 3 to 4 clock cycles, resulting in an effective speed of somewhere between 3 and 4 MHz (depending on the instruction mix, but probably averaging around 3.5 MHz).

> ... the reason was reentrancy.

...wouldn't passing the return value on the stack also allow reentrancy? Passing args and return values in registers instead of on the stack would definitely be faster, and on 8-bit CPUs, every clock cycle mattered.


> ...wouldn't passing the return value on the stack also allow reentrancy? Passing args and return values in registers instead of on the stack would definitely be faster, and on 8-bit CPUs, every clock cycle mattered.

But the stack is memory, at least in these systems. So I think problems could arise if we threw concurrency into the picture, where you also have to synchronize the memory access, maybe?


Switching to a different "thread" would also switch to a different stack (although that would be kinda tricky on a 6502, where the stack is hardwired to the $100 page). Actual multithreading on an 8-bit CPU is a bit of an esoteric topic though; usually one only has to deal with interrupt service routines running concurrently :)


You're already using the stack for parameters without compromising reentrancy; using the stack for the return value too would be trivial. Just like with parameters, it's easier and faster to use registers where possible on those old systems.


A bit strange to mention Lazarus as if it were a thing of the past ("could never match...") but then also mention a few lines below that the last release was a week ago (almost two weeks now). It looks to me like the project is reasonably active: https://wiki.lazarus.freepascal.org/Lazarus_3.0_release_note...


It's more a statement about the life of non-mainstream open source projects vs. what companies with many dollars can do. Of the two, it's Delphi that's dead to me ;-)


I was a little different. After a couple of years or so on BASIC from 1979 onwards, and a very tiny dabble with Pascal, I spent the next four or five years on assembly, both Z80 and 8086. This was followed by a couple of years in COBOL, before I graduated to UNIX and C, where I've been comfortable ever since.


>> Late 70s and 80s: forget BASIC, we had Pascal and C

Well, for one, PCs were ridiculously expensive back in the day, so that rules out using the PC version of Borland Pascal / C. I had them in school, but we had mostly CP/M-based machines and only two PCs: a 286 AT and an 8086 XT. If I recall correctly we had Borland Pascal 5.0 and it was blazingly fast on both machines. Then some dumbass decided to upgrade to Borland Pascal 6.0 and wiped 5.0 from the XT machine (lack of space, maybe? but mostly stupidity), replacing it with 6.0. It was horrendously slow. I got my first look into today's "enterprise" compile times :) Pressing F5 (or F9 or something to compile, I don't recall the exact key) took about a minute just to show the compilation dialog. Several more minutes to compile. Compared to instantaneous on 5.0. It rendered the whole thing unusable on the XT.

On the other hand, I hated working on the school's computers. They were overcrowded, too few for too many students, and I always had barely enough time to type in the examination programs (I was studying computer science) needed to get my grades, never enough time to fiddle and experiment. Or game.

So for that I used my dear home computer, a ZX-Spectrum compatible clone manufactured locally: http://retroit.ro/wp-content/uploads/2019/12/hc-85-maro-678x...

Initially I did BASIC, but as time progressed I needed more speed, and interpreted BASIC was hopelessly slow. I wanted to program TETRIS, actually :) So I switched to Pascal, more precisely HiSoft Pascal for the ZX Spectrum. Here's the book I learned from; there was no Internet back then, so the book was quintessential: https://imgur.com/a/qweDG2E

Actually, that's also where I learned and understood memory management: linked lists, allocation, deallocation. Before C, for me there was Pascal. One thing that blew my mind, and that I didn't understand until I programmed and ran it with my own hands, was recursively referring to the defined structure... in the very structure. You know the C definition:

  typedef struct node { int val; struct node *next; } node_t;

How the heck can you refer to something that's, in a way, not yet defined, I was thinking.

So I eventually implemented that TETRIS in Pascal, and I've no idea how I did it without a debugger... I guess with print statements and just thinking with pen and paper. It worked well enough, although it did crash on occasion and I never figured out why :)

Anyhow, sometime later I got my hands on HiSoft's compiler for BASIC. Very cool thing: you'd load the compiler from cassette, then write your program in regular ZX Spectrum BASIC, then RANDOMIZE USR <some number> (the address of the loaded compiler), and the thing would put machine code somewhere in the remaining memory and give you its address and size. So you could save the machine code directly to cassette and run it directly next time. And all that in less than 40 KB of memory for the BASIC code, the compiler code, the resulting compiled program, the stacks, etc. Believe it or not, I rewrote that TETRIS in BASIC and was able to compile and run it. Bonus: it never crashed; I guess the BASIC version was inherently more stable.


The linked Forth article is a good read - I hadn't seen it before, either!



