Turbo Pascal was an IDE and a blazingly fast compiler. In 1986, I ran it on my 64k IBM PC (8088 4MHz). It was like nothing else I had experienced previously. Prior to that, I had developed software in grad school on punch cards. I occasionally had access to a PDP-11. I don't remember what editor I used, but it certainly wasn't an IDE. It may have been line-oriented. At work, we had a VMS system, and I had access to a C compiler and my first emacs. Not too bad, but nothing as immediate and responsive as Turbo Pascal.
Viewed another way, Turbo Pascal recaptured the incredibly rapid edit/run cycle that I first experienced with a PDP-8M running BASIC. Only now, I had 64k instead of my share of 12k split among a maximum of four users, an IDE instead of a line editor, and a far better language.
Turbo Pascal also blew away anything else available on the PC. I had a C compiler from Microsoft that was far more expensive and far slower.
Turbo Pascal was a truly magical piece of software. I remember that I had some initial skepticism about it. As I recall, it took some liberties with the language. But once I finally used it, I was instantly converted.
IMHO, nothing has quite captured that experience of development until IntelliJ, starting with 4.x or 5.x.
There's a dedicated group of troopers who have kept the flame alive. It's cross-platform. There's an Android port in progress. There's a JavaScript backend. The IDE has all the good things about Delphi 7 and none of the drawbacks.
Yes! I use it and really enjoy it so far. There are obviously some people using it for _very_ advanced work in medicine and the sciences, judging by some of the community posts and shared libraries... Personally I think I'm building yet another text editor (starting with a text viewer), but the experience has been great.
Definitely the Lazarus Forum as others have noted. In my opinion, it's a really delightful time capsule of how internet communities used to be before Facebook/Reddit. Insanely helpful and knowledgeable members that will answer any question, a bunch of eccentric characters, plenty of n00bs asking mundane questions, and the compiler developers themselves. Probably my favorite language ecosystem.
All you have to do is enter delphi in GitHub's search box and you get a whole pile of projects. Alternatively, you can enter pascal instead of delphi. This will again get you a pile of projects, albeit a smaller one. If that is difficult, I am not sure what easy is.
I agree with brnt: while Delphi/Object Pascal seems to be quite popular according to the Tiobe Index, https://www.tiobe.com/tiobe-index/delphi-object-pascal/, I find it hard to find open-source projects when comparing it with other programming languages. Some possible explanations:
* Delphi Programmers do not like to showcase or open-source their projects?
* Delphi/Pascal is used to create mainly proprietary software?
Either way, I think that for Free Pascal to prosper in the long term, something should change. Perhaps more open source projects published on Github/Gitlab.
FreePascal/Delphi are used by a large number of developers. I do not think there is a need for them to prosper in the sense of a high Tiobe index. There are far more obscure languages that some pretty big companies are using for their internal stuff, without much ado about popularity.
You are confusing "hard to find" with "less". There are far fewer projects on GitHub (the original question specifically mentioned GitHub). However, 9,000 is still a big number, and one can find virtually anything needed from a practical standpoint. On top of that, both Delphi and Lazarus come with huge system libraries.
C++, for example, dwarfs Delphi on GitHub. However, when I am doing C++ I am looking for very few projects in very specific areas (networking, file management, DB connectivity, etc.). Suddenly your choice is not that big once you filter out all the noise.
Also, a whole bunch of Delphi projects are hosted elsewhere for various reasons. Hence my advice to ask Google.
Finally, Delphi projects tend to be rather large and high quality; check this one for example: https://github.com/BeRo1985/pasvulkan or other repos from the same guy.
Even though I have been using Turbo Pascal/Delphi/Lazarus since the '80s, I am not really part of the community. I do not think you'll find a single post of mine. So do not take me as an example.
I have to admit to being a prick in this particular matter. When a search on Google, GitHub, etc. reveals a ton of info and you still cannot find the subject, I think that says something.
One extra thing. I do use Linux and find it very useful. Do I need to abandon it because Linus is not a shining example of a polite person?
I think on here you can safely assume I have tried to Google and to do a code search on GitHub. As another poster shows, certainly comparatively, you won't find much. What I did find is nearly always Delphi (which does not translate to FreePascal, the only platform that would be interesting now, I'd say). Very few complete and modern(-ish) repos, rather than someone throwing their 15-year-old Delphi project online.
I've done some Pascal coding, and this lack of libraries and projects is a practical impediment for me. FreePascal (and Delphi) come with very complete stdlibs (if that's what they are called), but I wouldn't be satisfied with merely Python's stdlib either. On top of that, string handling is IMHO very uncomfortable in Pascal; it feels like not much has changed in 20 years. I found a recent FreePascal university course, but source files with 10 kLoC do not make it helpful for getting started.
So, while the Freepascal forums seem helpful, it's a meager resource if that's basically all the community is.
I don't. Whoever thought up recompiling Lazarus whenever you need to simply add another component to its palette really lost the spirit of Pascal/Delphi on that one.
It was done because of technical limitations: the compiler didn't support dynamic packages (collections of dynamic libraries). This in turn was due to the compiler having to work on multiple OSes (I'm not 100% sure on the details) and how they share and export symbols from dynamic libraries, something Delphi didn't have to worry about since it only had to work with Windows and only with its own libraries. FPC and Lazarus are also technically separate projects with their own release schedules, and Lazarus has to work with multiple versions of FPC, which makes things harder. All this made introducing dynamic packages very hard... and since there wasn't much benefit for most uses (recompiling Lazarus is almost instant after the first compile, and even that takes less than a minute on old hardware), it wasn't something the FPC developers focused on much.
Having said that, the next version of FPC will support dynamic packages. They had to change some things in how RTTI is stored, apparently, to accommodate the OS differences (this may break some programs that use the RTTI structures directly, though the fix is trivial and 99.999% of programs won't be affected; even some of my own code that touches RTTI directly works fine in both the latest stable version and the development version).
However, chances are Lazarus will keep using the recompile approach for some time, since it'll need to work with FPC 3.0.x... and it will also take some time to update all packages to work dynamically.
Anyway, what I'm trying to say is that this was done because of technical issues and compiler limitations stemming from what FPC and Lazarus have to support, and it remained an issue for a long time because in practice it didn't make much of a difference. However, it will soon be solved.
" Delphi didn't have to since it only had to work with Windows and only with their own libraries"
Delphi has for some years now been able to work on more than just Windows, and it is still a breeze, regardless of whether I'm targeting Android or Windows, to just say "here is the package, put the component on the palette" with just two clicks.
I really hope Lazarus will deliver on this, like you say, since Delphi is not targeting ARM on Linux while Lazarus does. We'll see what the future holds, as I have no problem using one or the other as long as the compiled project works on its intended target. Also, regarding the RPi: I can just install Android on an RPi 4 and Delphi will have no problem targeting that.
AFAIK Delphi on non-Windows is very different from Delphi on Windows, exactly because of such issues (and AFAIK it even used FPC for a while for non-Windows targets, though this was eventually replaced with their own code when they started using LLVM). They were perfectly fine with having differences between platforms at the core level; however, this wasn't what the FPC or Lazarus developers wanted. In general they try to have things operate as similarly as possible on all OSes, and they also support way more targets than Delphi, which complicates things.
In any case, I was referring to Delphi in general from the start. Delphi has had support for dynamic packages since version 1, but for decades it only supported Windows. FPC, on the other hand, had support for multiple OSes before even version 1.0 was released.
TBH I do not remember the exact details, but IIRC it had to do with memory management, symbol resolution and RTTI support across DLLs (e.g. being able to do something like "Foo is BarClass" in DLL A, where Foo is an object instantiated in DLL B with a class that is only available in DLL B but descends from a class defined in the main program, whereas BarClass is a variable holding a virtual class defined in DLL C that also descends from the same class; essentially trying to merge RTTIs from different AOT-compiled modules where none has a complete view).
And of course there was the fact that it was a low-priority issue, since there wasn't any major practical drawback (personally, the only issue I had was that I couldn't share the RTL and LCL between executables that I'd otherwise package together, meaning the produced executables took more disk space than necessary, since each executable has its own copy of the RTL and LCL instead of using them from a shared object/DLL).
Yeah, Delphi is still trying to find its compass when dealing with cross-platform. The intentions are good; it's the execution that's lacking. The problem is that currently neither of them (neither Delphi nor Lazarus/FPC) is a good fit overall. Each solves problems on different platforms differently while having other problems that are non-existent for the other.
I do have fun playing with them though, trying to learn what quirks lurk in each. One example is what I said above: Delphi can target ARM if it's Android, but can't target ARM if it's Debian. So I can run on an RPi something Delphi cooked up if I have Android there, but if I have Raspbian I can't. Same hardware, different OS, such a hard nut to crack :). Whereas if it's Intel and you have Linux, guess what: Delphi owns it.
So for now I am switching between IDEs/frameworks depending on the availability of the technology and, most importantly, what the client(s) want.
Interesting. I haven't needed to do so, but I thought I had heard that was a pretty quick recompile, on the order of < 10 seconds? I know there are some tips to speed it up.
Except that when you need it to, it doesn't work, or creepy bugs won't let you do it. Do try it with Lazarus for Raspberry Pi, for example (you can do it in a virtual machine if you lack the real device). You'll see what I mean.
Are you using the official releases? If you are using Lazarus from Debian or some other repository, it will not work, because those packages do not set up the permissions needed for the executable to be replaced.
In general IME on Linux the best approach is to install FPC (either from repositories or, better, using the official release) and then download the Lazarus source code and compile it somewhere in your home directory with 'make bigide'. This will avoid any problems with Lazarus packaging that often tends to be broken (again, IME).
I used it in my home directory instead, from its official release. I also sudo it when launching it, so it has full rights. For some packages it didn't work no matter what I did; I had to use the official apt packages from the repositories for those (lucky me, they had them).
I'd love to hear a bit about your experiences with Lazarus.
A few years ago I tried playing around with it on macOS, but all I remember is that I wasn't able to get things up and running because I kept encountering problems. Maybe I was doing something wrong or I didn't invest enough time in setting up and learning how to properly use the tool. I think I tried using Homebrew to install everything during my previous attempt, but since I'm not seeing any mention of this option as part of their macOS installation instructions [0] then I'm guessing that might've been the source of my problems.
Would you still recommend learning Lazarus and FreePascal in 2020?
I've mostly done web development, but I'm also interested in traditional native application development. My hypothesis is that learning to use some of these older tools or systems can teach you a lot of lessons which carry over across languages and platforms. For example: learning a bit of Qt helped me refine my internal models for various UI widgets. I want to learn to write faster and higher quality applications than our current highly-frustrating standard.
Happy memories! Including my first programming contract.
In high school, I ended up doing telephone survey work, mostly market research. We had to fill out these terrible timesheets so that clients could be billed for each call. Poking around, I discovered that the office had a PBX [1] to run all our phone lines. It was hooked up to a PC, and in the manual I discovered it would log every digit dialed, plus other events.
I talked my bosses into letting me mess with it. And then I convinced them to pay me some outrageous sum ($500, I think) to write Turbo Pascal software that would generate an automatic report every night replacing the timesheets. Given that I was making something like $5/hr (well above the $3.35 minimum wage), it seemed like a great double victory.
Single-pass compilation requires forward declarations (or requires the author to rearrange their definitions); it would also prohibit things like splitting a type definition up between different source files. That said, a simple two-pass compiler whose first pass simply _indexes_ each discrete compilation unit would work (and such an index could be written to disk with no need for memory usage). This would probably enable _most_ of the functionality of a C#-like language; heck, you could probably also add a trivial third pass to implement generics (albeit as C++-style templates rather than type-safe generics) with compile-time code generation (again, to disk).
GCs aren't new either - going back to John McCarthy's 1959 implementation in Lisp - which makes me think a non-optimizing AOT C# compiler and stop-the-world GC could probably be implemented with minimal memory usage (actually, isn't this how the .NET Micro Framework worked?)
Looking at the .NET project I'm currently working on: the actual compilation of ~800 C# source files to CIL takes less than a couple of seconds; the rest of the build time is taken up by asinine build steps in inherited MSBuild .targets files that I don't have any control over (one step involves copying all of the source files and assets to a separate pre-build directory, twice). Methinks problems with computer efficiency exist in non-obvious places.
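The "index first, resolve later" two-pass idea above can be sketched in a few lines. This is a toy illustration (in Python for brevity), and the `func`/`call` mini-language is invented here, not part of any real compiler:

```python
def index_declarations(source_lines):
    """Pass 1: record the line of each 'func NAME' declaration, nothing else."""
    index = {}
    for lineno, line in enumerate(source_lines):
        if line.startswith("func "):
            index[line.split()[1]] = lineno
    return index

def check_calls(source_lines, index):
    """Pass 2: verify every 'call NAME' refers to an indexed declaration."""
    errors = []
    for lineno, line in enumerate(source_lines):
        if line.startswith("call "):
            name = line.split()[1]
            if name not in index:
                errors.append((lineno, name))
    return errors

program = [
    "call helper",   # used before its declaration: fine, thanks to the index
    "func helper",
    "call missing",  # never declared: reported by pass 2
]
idx = index_declarations(program)
print(idx)                        # {'helper': 1}
print(check_calls(program, idx))  # [(2, 'missing')]
```

Because pass 1 touches nothing but declarations, its output is a small table that could just as well be written to disk, which is the point being made above.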
Note that you can avoid a second pass by doing 'backpatching': essentially assuming whatever code you encountered was correct, emitting the relevant bytecode/machine code with enough reserved space for an address (e.g. for a 'jump to address' instruction, emit zeroes for the address), keeping a list of the 'unknown stuff', and then resolving those entries once compilation is finished.
Many years ago I wrote a Java-like scripting language (classes, GC, etc.) with that approach. Funnily enough, at the time it didn't occur to me that I could have separate bytecode streams for methods, etc., so the VM was almost like a custom CPU (the only non-CPU-like feature was an instruction to allocate memory). Since it generated bytecode while parsing the source code itself (I even skipped any separate tokenization and parsed the character stream directly), and since it also allowed code to be placed outside of functions or classes (it was a scripting language after all), every time a class or function was encountered in the source, it'd emit code to jump over it. So the final bytecode would start with a series of jumps over the functions and classes before arriving at the first actual instruction :-P.
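The backpatching technique described above can be sketched like this (Python for brevity; the one-byte opcode format is invented purely for illustration):

```python
# Minimal backpatching sketch: emit a jump with a placeholder address,
# remember its position, and patch it once the target is known.

JMP = 0x01  # hypothetical opcode, followed by one byte: the target address

code = bytearray()
patches = []  # positions in `code` whose target address is not yet known

# Emit "jump to <not yet known>": reserve a zero byte for the address.
code.append(JMP)
patches.append(len(code))  # remember where the address byte lives
code.append(0)             # placeholder

# ... more instructions would be emitted here; eventually the target is known ...
code.append(0xFF)          # stand-in for some other instruction
target = len(code)         # say the jump should land just past it

# Backpatch: fill in every recorded placeholder with the resolved address.
for pos in patches:
    code[pos] = target

print(list(code))  # [1, 3, 255]
```

The same idea scales to any number of unresolved forward references: each one just adds an entry to the patch list, and the final loop resolves them all.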
”Single-pass compilation requires forward-declarations (or the author to rearrange their definitions), it would also prohibit things like splitting a type definition-up between different source files”
You probably know, but for those who don’t: Pascal requires both. It’s not as if Turbo Pascal implemented a subset of the language.
Also, multiple passes are less important nowadays. In the early days, memory was so tight that each pass wrote its output to disk or tape. Early FORTRAN compilers had many. See for example the 'sections' in https://archive.computerhistory.org/resources/text/Fortran/1.... Depending on how you count, that's 6 to 9 passes. (Or one. That document says "With one exception, Fortran may be considered as a one pass system. That is, it looks at the source program only once, and it makes a scan of each statement once only. From then on, references are to tables only." I think that's bending the truth, though.)
Nowadays, intermediate results can be kept in RAM. Clang, for example, can run tens of passes, some of them multiple times. That still slows things down, but less so.
Some modern languages do the equivalent. Thanks to lazy evaluation, you can have a separate lexer, parser, and code gen, but still have a single pass as long as you have an upper bound on backtracking.
Pretty sure it was the world's first IDE, not just an IDE. In a single .exe file it had a WordStar compatible text editor, compiler, linker and libraries.
That's pretty much just like a modern window-based user interface... in 1980!! It had overlapping windows, windows that could be minimized and moved around, a context menu via right-click, spreadsheet-like data... truly amazing.
AFAIK the first "IDE" was the one in the original BASIC, or at least the first one to be called that. It was command-line based (no fancy GUIs back then), but it allowed editing, debugging and compiling (despite BASIC dialects commonly being interpreted, the original BASIC was a compiler) from the same environment.
I have to admit that I still judge programming tools, and in fact, entire computers, against the beloved BASIC prompt. You flipped a switch, got a BEEP, and 3 seconds later, you were in heaven.
Today the closest thing for me is firing up a PC and opening a Jupyter notebook. But it's certainly not the same.
I remember wondering how on earth you could write Pascal programs without line numbers since that was the difference between running a command and entering/editing a program in the Apple ][ environment. I don't remember if Apple Pascal had what would qualify as an IDE now, although it had its own OS which made it less attractive to me.
> I remember wondering how on earth you could write Pascal programs without line numbers
You use a program called EDLIN that writes the line numbers for you, and even renumbers your lines automatically whenever you have to insert something in the middle of your program. It's really neat!
The fact that tiny shops like Xojo make completely proprietary and niche languages and compilers shows that there are developers who are willing to pay for their tools.
That's funny. I was using IntelliJ IDEA at version 3, and it was an amazing experience. With version 4 they made changes to the code parser that made it slow and incredibly unstable. It's been improved since then of course but in my mind they still haven't gotten back to the speed and efficiency that they had 15 years ago.
I may be off as well, and we're probably talking about the same version. As far as I remember it was version 4 that was the slow and buggy one.
I was trying to find out when these versions were released to see if I could line it up with the projects I was working on at the time, but I can't find a list of major releases with dates.
It’s such a nice coincidence that your path matches mine as well, going from Turbo Pascal to IntelliJ.
When I started on Java, I was disappointed at the quality of IDEs so much that I resorted to coding in Notepad instead. I was so used to the seamlessness of Borland IDEs. It was only when I did a 30-day trial of IntelliJ that I started using an IDE again.
I doubt anybody ever bought a 16KB tape-only IBM 5150 to hook up to their TV, but it was indeed the entry level configuration. It probably just existed to make the machine's price look good - even today all articles quote that configuration's price of US$1,565 (equivalent to $4,401 in 2019) instead of the $3K or more a practical PC would cost.
The PC could only be expanded to 64KB on the motherboard. Anything beyond that required expansion cards, each adding at most 64KB initially (which opened the door for larger third party upgrades from AST and others). With only six slots and separate cards for the floppy disk controller, video, serial and parallel ports the largest practical configuration was initially only 192KB. Not that the software could take advantage of all that (in the case of .com stuff quickly ported from CP/M, things improved with .exe).
When most people try to remember the original IBM PC they normally think of the 1983 PC XT (IBM 5160) instead which had more slots and more memory.
Around 1989 I was asked to fix a Basic program that tested car shock absorbers. It took me less time to write a new version in Turbo Pascal than it would have taken for me to completely read the source of the original program. The IDE made a huge difference but the speed was also important since it allowed me to modify and try several times per minute.
Wrote a program to manage the sale and rental of video tapes for a small business. I wrote it in 1984, using Turbo Pascal. The last time I checked, the store was still running the same software in 2005.
The application was table driven, all important configuration items were stored in a text file as a key-value pair. It had a bar-code scanner and was tied into a cash drawer. Receipts were printed on small dot matrix printer.
It did. Like I said, everything was table driven, with key-value pairs. The owner opened up the text file and added a new rental type: DVD. They also started renting VHS players and DVD players, just added as new categories.
If the sales tax ever changed, the owner could easily make that change as well. It had a built in rewards program, specials, sales, all editable in the text file.
Once the software was delivered I never touched it again.
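A table-driven setup like the one described can be sketched as a plain key-value text file plus a tiny parser. The file format, key names and rates below are hypothetical, invented to illustrate the idea, not taken from the original program:

```python
def parse_config(text):
    """Parse 'key=value' lines; ignore blank lines and '#' comments."""
    table = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        table[key.strip()] = value.strip()
    return table

# The owner adding "DVD" as a category is just another line in this file;
# the program itself never needs to change.
config = parse_config("""
# rental categories and daily rates
rental.vhs=2.50
rental.dvd=3.00
tax.rate=0.07
""")

print(config["rental.dvd"])  # 3.00
print(config["tax.rate"])    # 0.07
```

New categories, tax rates and specials are all just edits to the text file, which is what made the program survive two decades of changing inventory untouched.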
I'm curious how large the actual components delivering that page are. I suspect they're significantly larger. HTML output (in this case) is a product of a program, not the program itself.
> And the whole thing was lightning fast, orders of magnitude faster at building projects than Microsoft's compilers.
This particular comparison is interesting when you consider that the primary author of Turbo Pascal, Anders Hejlsberg, has been at Microsoft working on C# and TypeScript for quite a while.
I suppose we could call Kahn the primary author of Turbo Pascal v1 if we exclude the actual compiler from the definition of "Turbo Pascal", but I'm not sure why we'd do that, especially in the context of this article. He definitely deserves credit for establishing the product category of "credible off-the-shelf IDE for commodity microcomputers", though.
Since many of the problems we solve today were already solved in the '80s and '90s with tiny tools such as Turbo Pascal, we can conclude that faster hardware and lots of memory are allowing us to bloat software with layers upon layers of abstraction, SOLID and DDD principles, virtualizations inside virtualizations inside virtualizations, emulators running on emulators, and totally inefficient interpreters.
Using Python, with its Global Interpreter Lock, seems like a total antipattern when it comes to using resources wisely.
Well, some things are enabled by more involved software architectures that would not have been realistically achievable back then. Turbo Pascal was limited to showing only the first error it encountered when compiling your program. It did move the cursor to the location of the suspected error. It wouldn't have been able to generate error markers in the background while you type. Autocompletion would not have been realistically achievable either.
That doesn't mean that lots of programs aren't bloated by overly complicated internal designs. Sometimes this is a concession to the development team structure. Sometimes this is incompetence. It is hard for me to tell from the outside.
TBH, in practice I never found showing more than one error useful, since more often than not the errors are either related or the first error is the cause of the others. I only find it useful when the compilation process is very slow, but I try to avoid slow compilers whenever possible and just recompile whenever I fix the first error. Most IDEs move your cursor to the error position, so the build key also acts as a "move to next error" key :-P.
It depends on the language you are using. Sometimes it can be useful to see more than the first error message to understand the problem.
I find all the squiggly lines and annotations that Resharper adds to the code quite helpful. These are a kind of low key way of indicating minor issues with the code. I find that appropriate for things like coding style violations because they often aren't an immediate concern because I want to get things working first. But they are a good reminder of the open cleanup work once that is done.
I think it depends on more than just the language... I find the squiggly lines very distracting, and they bring me a bit of anxiety, like someone looking over my shoulder at my code and going "hey, you forgot a semicolon over there", and I'm like "YES, I KNOW, I just noticed something else that I want to edit a few lines above, I'll fix it, leave me alone" :-P.
I don't have any problems with that at all. I confess that I have a kind of a disconnect with people like you who seem to be easily distracted by this. I don't understand it. I am not thinking that you are wrong or anything like that - it's on me. This is mostly my confusion talking right now.
Sometimes I wonder whether this is somehow related to a quest for clean/simplified (or, as some might say, dumbed down) user experiences in other situations. Also, is this in general more of a thing in America compared to the rest of the world?
Yeap, but note that modern hardware also allows simplifying the code considerably, thus making it easier to avoid bugs. As an example, Turbo Pascal used to have an add-on called "Turbo Toolbox for Databases" (or something like that) that demonstrated a simple database API by managing the customers of a dentist (that was the example), stored as B+ tree indexed records on disk. This added a lot of complexity, which was necessary due to the memory and disk limitations of the time.
Nowadays, the same "database" in Free Pascal can be achieved with a simple "array of TCustomer" type that keeps everything in memory and is loaded/written to disk with a couple of lines of code. Computers are so fast that unless you have every single citizen of a European country as a customer, you can simply perform linear searches directly in RAM without touching the disk. If anything this approach would also be much faster than if the Turbo Toolbox approach was used on a modern PC.
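The point above can be sketched in a few lines (Python here for brevity; `Customer` and its fields are illustrative stand-ins, not the Turbo Toolbox API):

```python
import json
from dataclasses import dataclass, asdict

# A "customer database" as a plain in-memory list: no on-disk B+ tree
# indexes, just linear search over records that fit comfortably in RAM.
@dataclass
class Customer:
    id: int
    name: str

customers = [Customer(1, "Ada"), Customer(2, "Niklaus")]

# "Loaded/written to disk with a couple of lines of code":
serialized = json.dumps([asdict(c) for c in customers])
restored = [Customer(**d) for d in json.loads(serialized)]

# Linear search directly in RAM, fast enough for millions of records.
found = next((c for c in restored if c.name == "Niklaus"), None)
print(found.id)  # 2
```

For a dataset that fits in memory, the linear scan avoids all the index-maintenance code the Toolbox needed, which is exactly the simplification being claimed.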
Of course a lot of software is very inefficient, but TBH I do not ever expect this to change, since people will always value features and convenience above performance, even when they do not use said features themselves.
It's even more interesting when you consider that it ran under MS-DOS, which had no dynamic linking --- it's a statically linked binary.
The touch command under OS X Lion (44,016 bytes).
...and I bet that one is dynamically linked.
I believe the reason why it's so fast is because it has no optimiser, and generates code as it parses. Pascal is also one of the easier languages to compile.
Here is touch.c [1] from macOS. The only change since Lion is the addition of the __used attribute on a couple of consts at the beginning. It's very similar to the FreeBSD one, though they've diverged slightly in the past 2 decades.
It supports 8 different flags (plus an undocumented -? flag), with support for using absolute or relative times, setting various timestamps on a file, and options for dealing with symlinks and permissions and such. More features than I'll ever need, but not outrageously so.
On Mojave, this file is only 23392 bytes, so it's no longer bigger than Turbo Pascal. Plus, automatic filesystem compression shrinks this down to only 7340 bytes on disk. Since blocks are 4096 bytes, this is actually the second-smallest (non-empty) file!
Turbo Pascal programs relied on separate on-disk .BGI files that contained object code and effectively implemented a kind of dynamic linking (albeit in a single-task context). It was possible to directly link them into the final executable, but this wasn't the default.
TPUs are closer than BGIs, but still aren't dynamically linked. They are object files which are statically linked into the final exe.
It's possible that the author is thinking of pre-TP4.0 chain files (which were used to get around the 64K memory limit of .COM format before TP switched to generating EXEs.)
What is all the weight in that "touch" binary? It can't be code. My guess would be that the format is just not optimized for size and contains a bunch of boilerplate records whose minimum size is in the kilobytes.
Turbo Pascal is a language I've never used professionally, but it has always had a special place in my heart. I grew up poor, without any kind of computer, but always considered computers "fun". As a high school senior (1995) I took what I thought was just another computer class; they were teaching Turbo Pascal, and I immediately knew I had discovered my life's passion.
Same for me. I learned most of my programming in high school in our computing classes using Turbo Pascal. I would finish the classwork in a matter of minutes and spend the rest of the time writing games and other software. Our teacher didn't mind us playing games as long as we had written them ourselves.
I still have 2 post-it notes on my second monitor from when I decided to finally deep dive into the Android Framework.
One says "1988 Pascal", the other one "2018 Kotlin". I wrote those after noticing how similar the syntax is for variable declaration and it felt like going back to my roots.
The sheer joy of knowing your IDE, the language and the target platform all in one interface reminds me a lot of the days staring at the blue screen of Borland Turbo Pascal in the early 90s. I really missed it in all those years of web development I did afterwards.
I've never used Turbo Pascal, but I've used UCSD Pascal, which was also pretty small. In a college computer graphics course in around 1980, we ran UCSD Pascal on Terak 8510/a machines, which had a total of 28K 16-bit words (56K bytes) of RAM.
The Terak was an interesting machine:
> The Terak 8510/a of 1976 or 1977 was among the first desktop personal computers with a bitmap graphics display. It was a desktop workstation with an LSI-11 compatible processor, a graphical framebuffer, and a text mode with downloadable fonts. Despite the lack of a MMU, it was capable of running a stripped version of UNIX version 6. It was the first personal machine on which the UCSD p-System was widely used.[1] Various universities in the USA used it in the late 1970s and early 1980s to teach Pascal programming. It provided immediate graphic feedback from simple programs encouraging students to learn.[1]
Also, in those days, we were running Version 7 Unix on a PDP 11/45, which supported several concurrent users in what would now be an unimaginably tiny 256K of RAM.
While today's code does a lot more, and has to deal with everything from internationalization to accessibility, I have to believe that we (I was around programming in those days!) were better at computer programming back when computers had memory measured in kilobytes than we are today.
I still take on jobs to do embedded programming in assembly language on PIC or similar processors and my clients are usually amazed at what can be done in tiny amounts of code.
I think part of the fun in the "old days" was that you could familiarize yourself with your tool chain in its entirety. I mean, I knew every command in Turbo Pascal, and virtually every feature of the "operating system", such as it was. Even finding out information about algorithms was hard before the Internet, so I often had to work out my own.
The microprocessor wasn't radically different from the 4 bit computer that I built on breadboards for my college electronics class, and I could tell you pretty much all of the things it could do.
Then the only limitation was you and your own wits.
Today, not only is there a language to learn, but libraries, and a framework, and an architecture, and a "stack" and revision control system... you can't ever master the whole thing and it's changing faster than the baud rate of my eyeballs. It's still a boatload of fun, but a new kind of fun, and sometimes I like going back to the old kind.
My first "professional" language was ColdFusion. I literally took the manual that came with the install disks home and learned it all in about a week. (1999)
Severe limits/restrictions can inspire some amazing solutions. To me, this separates the real programmers from the people who just get paid to bang out code. We went to the moon and back under this kind of severe limitation, yet today we can't make a website without a library requiring hundreds to thousands of dependencies, resulting in a total download larger than the amount of data required by the entire moon missions. That's just for one page of useless internet. I just shake my head at the silliness.
> We went to the moon and back on this kind of severe limitations
Adjusting for inflation, NASA spent $283 billion going to the moon[0]. I can't be bothered to find a breakdown of how much of that budget could be directly tied to writing code, but I suspect that the investment was a wee bit higher than a rando frontend dev slapping together some libraries in 2020.
Not that I disagree with your point, but the computers used for the Apollo program needed a healthy amount of printed manuals and ground-based support to be functional. They were only a part of the computational stack required for Apollo.
I had so much fun with Turbo Pascal in the early 90's, I used it to create everything from GUI frameworks to ANSI bulletin board systems and keyloggers. Definitely the most creative environment I've come across.
Then came C++, and I wasted A LOT of time trying to wrap my head around it while creating nothing of value before giving it up entirely.
My Pascal experience also landed me my first job after university, which was to help write/maintain a system developed in Delphi.
I discovered Turbo Pascal in my preteens, after BASIC. Spent a few years with it and got pretty fluent. Interestingly enough, I remember also working on a bulletin board system, called something like WWIV. Now that I think about it, that was a seed planted for my current occupation.
Learned C after that, but drifted away from programming (for a decade or so) around when C++ came out. I came back around to computers when I wanted to build myself a static website - and I'm still in the wondrous rabbit hole.
These days, much of my time is spent reading and writing TypeScript. How strange after all these years, I'm again speaking a language created by Anders Hejlsberg. It's a joy to work with VS Code editor as an integrated development environment, and I fondly remember my time with Turbo Pascal.
Turbo Pascal was what the cool kids coded in. Although that sounds like an offhand statement, the kids I knew who started out learning it went on to work at id Software later on. It was a little before my time, and I was jealous because QBasic was where I was at back then. Seeing it, though, gave me the courage to jump into Borland C++ not long after. It certainly made me realize there was a world beyond BASIC.
My Turbo Pascal story is weirder than most I think. When I was much younger, someone I knew was learning COBOL and the "editor" they were told to use was a pirated binary on a floppy. That binary was called tp3 and nobody paid any attention to it until I, wanting to also learn a bit about COBOL, decided to try it out myself and read the notice that the program output. Something about Pascal.
Long story short, I ended up looking into Pascal and using that tp3 "editor" to learn a bit of Pascal, which seemed a lot more fun than COBOL.
I was one of the 12 people who wrote a lot of Apple Pascal in the early 80s. One day I got a call from a tech writer who was reviewing Turbo Pascal for Byte or some other programming mag. He had heard through the grapevine that I did Pascal, and he brought over an Osborne luggable and a copy of Turbo Pascal for CP/M. I fell in love instantly. Although I was a die-hard Apple guy at the time, the PC version of TP turned me into a PC programmer practically overnight.
One other thing, it was so cheap and fast that people started using it as their text editor of choice. It would be interesting to know how many copies of TP were sold that never compiled a line of code.
I wrote a 3D engine in Turbo Pascal. There was a castle you could walk through. It was fascinating because all the calculations had to be done manually, including all the math to figure out where to plot a 3D point on the screen. Great times. Later I moved to Visual Basic and OpenGL. Nice memories from primary school.
Ah, Visceral BaySuck. I've been in places where all one had was MSOffice, and that VBA IDE was it.
I keep a private GitHub repo called NVBC (NecroVisualBasiCon) with snippets of all the madness that was trying to trick VBA into being a useful tool. The IDE was good, but the language itself was surgically precise in delivering 83% of what one wanted for a given task.
It was caused mostly by the way MSOffice access was exposed to VBA at that time, and maybe also by VBA itself; I can't remember. Anywho, standalone VB was pretty decent. Nice that you kept the library. It's part of the history.
I remember when we hired an intern straight out of college, and he had to do a task where one of the steps involved sorting a long list. He got so excited, and spent half a day implementing quicksort. He got so sad during code review when I showed him there was already a built in Array.sort method that was faster than his.
The BBC computers had a Pascal compiler that was about 5K, if I remember. It lacked quite a lot though (records, I think, enumerations and more), but it had nestable procs/funcs, so it wasn't too far off the language in other ways. Never used it BTW.
One thing I liked about Pascal but have missed in every language since is the ability to have arrays start from arbitrary values. It can be fudged in C/C++ pretty easily, but still, it would be nice to have it built into languages like Java/Scala/so many more.
Another nice feature is sets and ranges, so you can have something like
type Day = (Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday);
     Days = set of Day;
const Weekdays = [Monday..Friday];
...
if Today in Weekdays then ...
...
Sets are essentially bit flags but they look much better than a series of #define MONDAY 0x0001 #define TUESDAY 0x0002 etc. IMO, of course :-P.
Oh wow, I think I learned Pascal via Turbo Pascal. When I was 14 in 1984 I went to a Computer Camp at Virginia Tech and we used Pascal in this monolithic IDE sort of thing. Maybe it was actually UCSD Pascal.... We made a simple 16 room 4x4 dungeon with keys and doors. Each group wrote the code for one room, and then we tied them all together. It came out not bad for a bunch of kids.
When I got my Amiga a year later, I was happy that there was a Pascal compiler, then saddened that it only had a stdio sort of interface. No way to open windows and draw things or the like. Luckily I found a FORTH that not only had a 68k assembler but also all of the constants and things you needed to do graphics, with the bonus of producing a single executable file.
Sorta sad I never really got to use Pascal after that. By university time it was all Assembly, FORTRAN and C for the CS classes.
> I think I learned Pascal via Turbo Pascal. When I was 14 in 1984
Ha, I learned it via TP when I was 14 in 1994! So many dumb hijinks in that class. If there was somebody you didn't like, one thing you could do was space out past the end of line 1 of the program and just write END; in the 100th column or so. That made everything else in the file just a comment. There was no word wrap, so unless they knew how to page right, there was no way they'd ever figure it out!
I also wrote a 2-player pong game, but I knew the keyboard handler could only handle 3 keys at a time, so if you wanted to temporarily disable the other player at a key moment, you could just hold down three keys and shut him out.
I toyed with Basic on Z80 as a kid but I have fond memories of Turbo Pascal.
When I was 14 and first got access to a PC at school, Turbo Pascal was the first proper programming language I got access to. It had a compiler, strong typing, an IDE and a help page. The help page was invaluable, since we didn't have Internet access, nor did I own any Pascal books.
Pascal made me think anything was possible, from games to line-of-business apps. It provided low level access, and I toyed with rebooting the PC, entering the BIOS, and bypassing the BIOS password by writing to some hardware register.
My journey with Pascal didn't last long, as I discovered C and C++, which seemed even more amazing and flexible.
What version of TP? The linked article is about TP3, which lacks common programming language features that later versions definitely had (like proper modules or objects).
If you take a look at later versions or even early Delphi, I'd say garbage collection. Other languages from the Wirthian family had it, like Modula-2/3 or Oberon.
Lambdas/first class functions.
Unicode support.
Syntactic sugar up the wazoo.
Other than that, I think one could be very productive in a TP 5.5-ish language even today. The rest would be mostly about the infrastructure and community, like IDEs, package management and Medium posts with lots of memes.
Modern Pascal as implemented in the Delphi / Lazarus IDEs has basically all the essential features you'd expect of a modern language. It does not have a garbage collector, but I personally consider that an advantage. What it has instead is heap tracing: when it is on for debugging, upon shutdown the program will point you to the exact places in the source code where you leaked memory, should such a leak ever occur. The madExcept product does the same thing and more for Delphi.
The latest version drops ARC, which pissed me off, but I guess it is better to have a consistent history instead of a schizophrenic one that depends on the target platform.
Having memory management depend on the deployment platform is, I think, one of the most insane things Delphi's vendor has committed. Luckily they're backtracking on that. Anyway, decisions like that, combined with the maturing open source alternative Lazarus/FreePascal, have led me to use Delphi only for maintenance. All new GUI projects are going to the Lazarus/FreePascal combo.
I would say: having an installer that depends on 3 other languages, using a particular libc without the ability to switch it up, and being unable to cross-compile to other platforms.
Memory! You could only make 64k chunks of code and had to connect them with overlays.
PS: loved Turbo Pascal. Everything I know about OOP I learned from the manual for TP4. The first programming job I ever had required using it, but even then it was being replaced by C.
VGA-class cards could display an 80x50 text mode. Not sure it was easily accessible in MS-DOS without assembly hacks, but it could be configured from within Windows.
I have always used this mode when running TP, without any difficulties. I think it was enabled by a simple switch when launching the .exe, but I don't remember exactly, which means it was something simple and inconsequential.
Yeah, there were some hacky things you could do, e.g. using software from Quarterdeck? Don't remember. I'd have to go back and look at docs I wrote for a program I wrote at the time. I seem to remember supporting some alternative screen resolutions in an MS-DOS directory manager I sold.
What are you referring to as "intellisense"? I'm sure people usually refer to code-completion ("intellisense" is Microsoft's brand for it in VS) and AFAIK no Borland Pascal product before Delphi 3 had this. Delphi 2 certainly doesn't have anything like that.
I have both Delphi 2 and Borland C++ 5 installed on my main PC here, since I'm using them often (they are very fast; I only use BC++ as a C compiler though). These two tools were also made with Borland C++ Builder 1 and Delphi 2 respectively:
Turbo Pascal was pivotal for me growing up. I often think back on the complete joy of creating, when I was first learning to program as a kid. I have been trying to arrange my life in a way so that I can get back to that feeling. You can see some stuff I did long ago here: https://github.com/ca98am79/my-first-programs
Written by Anders Hejlsberg https://en.m.wikipedia.org/wiki/Anders_Hejlsberg, who is also lead architect of C#. I was writing a pascal compiler at the same time and was rather blown away by what Turbo Pascal was able to do.
I wrote an application that we sold to clients in TP. I implemented a set of libraries to do forms based editing, a cooperative form of multi-tasking, and the ability to pipe text between those tasks. It was a lot of fun, and a bit mind bending to have a function return from the spawn call twice. 8)
In the 80s, had an internship at a government outfit with very high security. This was before the days when PC security was really understood. I could easily have run off with all sorts of very secret stuff. What I did run off with was a copy of Turbo Pascal.
I used Turbo Pascal back in high school (1992-93, AP computer science.) By that point, I already knew C and it felt like a step backwards in some respects. Still, the IDE was impressive and it was a lot of fun!