
What I'm personally a bit horrified by is the notion that "Unix" is the be-all and end-all of OS design, with some fundamental decisions going back to the 70s... like everything being text. That seems like quite a mess in the current world of multimedia and increased networking. Yes, we have spent decades hacking to make it work... but more options and different designs would enrich us a lot more.
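
(To make the "everything is text" gripe concrete, here is a minimal sketch, purely illustrative, of the difference between scraping a tool's textual output and asking the system for structured data. The helper names are mine, and the ls -l column positions are exactly the kind of assumption that tends to break.)

    import os
    import subprocess

    # Text-pipe style: scrape file sizes out of `ls -l` output.
    # Fragile by construction: filenames with spaces, locale-dependent
    # date formats, and extra columns all break the position assumptions.
    def sizes_from_ls(path="."):
        out = subprocess.run(["ls", "-l", path],
                             capture_output=True, text=True).stdout
        sizes = {}
        for line in out.splitlines():
            fields = line.split()
            if len(fields) >= 9 and fields[0].startswith("-"):  # regular files only
                sizes[fields[8]] = int(fields[4])  # name -> size, if columns line up
        return sizes

    # Structured style: ask the OS directly and get typed objects back,
    # so there is nothing to re-parse and nothing to mis-split.
    def sizes_from_stat(path="."):
        return {entry.name: entry.stat().st_size
                for entry in os.scandir(path) if entry.is_file()}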

And really, we do have a layer that can connect all of the different systems together: namely IP. So even if we had fundamentally different systems, there is no reason why intercommunication wouldn't be possible at this point.



Yes, it's not either/or. The web is a kind of meta-OS with a very minimal and simple API, and as far as the web is concerned everything in userland operates as a thin-client. The OS is only of interest locally.

By the mid-90s I was furious with both Microsoft and Apple for setting computing back by at least a decade.

You could - with extreme effort - make an Atari ST multitask. And of course with the Amiga it was built in.

So why did we throw that away and go backwards to DOS and then single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?

Of course there were technical challenges - protected memory, protected processes, and so on. But the earliest versions of Windows didn't have those either.

So it was a disappointing and frustrating time - an alienating difference of philosophy between consumer computing designed for creativity and exploration, and box-shifting commodity computing designed for form-filling and bureaucracy, which might allow you to have some fun after hours if you behaved yourself and the machine didn't crash.

Considering how smart the Amiga team were, it would have been very interesting to see what they could have done with ubiquitous fast networking, high-res graphics and video, and pocketability.

I suspect the result would have been far more open and inspiring than the corporate sand trap we have today.


> The web is a kind of meta-OS with a very minimal and simple API

You may have missed the last 20 years of Web evolution, because the Web is now a very messy and hard to implement platform. Even Microsoft gave up on implementing a browser and switched to Chromium.


> So why did we throw that away and go backwards to DOS and then single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?

Because cheap commodity hardware, a single Intel CPU controlling everything, won out over an expensive custom multi-chip solution.

Every Amiga model was a new custom design, with large amounts of engineering effort put into it, until eventually the platform just couldn't keep up with how quickly PCs were dropping in cost.

Sure in 1989 an Intel x86 based system was a joke compared to an Amiga, and it was a joke in 1990, and 1991, and 1992, but it was less of a joke every year.

When it all began, PCs didn't even have anything that could charitably be called sound capabilities, but year after year new cards were released. There was massive competition, at first at the high end, but then sound cards got cheaper and cheaper until one day Microsoft demanded that a computer had to include one to be Windows Certified, and it was up to OEMs to figure out how.

Meanwhile Amiga didn't benefit from that technological explosion.

Same thing happened for graphics.

Same thing happened for networking.

Same thing happened for hard drives, and cd-rom drives, and types of RAM, and literally everything else.

Technology has often been one step back, two steps forward. Going from minis to micros, people complained about the same thing, and going from PCs to smartphones it happened again. Remember early smartphones / PDAs? They had a maximum process limit in the low double digits! Storage that was wiped out if the battery died! Everything ran in the same address space!

And remember the first 5 or so major versions of Android? It wasn't exactly a pleasant system to use.

But it got better.

The thing is, all-in-one custom hardware solutions will always, at first, beat out general-purpose computing. But those custom hardware solutions are expensive and slow to engineer. Now, if you are Apple and you can find the economies of scale to do everything in house, great!

But Amiga didn't have that scale. They had custom in-house everything, and they were competing not just against Microsoft but against literally every other consumer PC hardware manufacturer on the planet, all in a fight to the death to drop prices on DOS/Windows PCs peripheral by peripheral.


The sad truth is Amiga hardware barely changed from 1985 to 1992. The A500 was nothing more than a cost reduced A1000 - no new features. The A2000 was nothing more than an A1000 with expansion slots. The A2500 was a couple extra boards, not a new model. The A3000, my favorite machine, was effectively a cost reduced A2500, combining the 68030 accelerator board and SCSI onto the motherboard. Again, nothing new.

So after 7 years, all we had were expansion slots and faster CPUs. The OCS -> ECS chipset upgrade was very minor. That's basically it.

Finally, at the end of 1992, the AGA machines (the Amiga 1200 and 4000) were released. We at last had upgraded graphics, and even those didn't keep pace with SuperVGA. The A1200 was also totally gimped, with no fast memory and a slow 68020 that was by then over 8 years old...


> You could - with extreme effort - make an Atari ST multitask. And of course with the Amiga it was built in. ... So why did we throw that away and go backwards to DOS

Mine is a US-centric perspective, but it always seemed like Commodore and Atari were selling niche follow-ons to their eight-bit lines, with momentum heavily toward DOS, even in the second half of the 1980s.

Open up the back of a 1987-era computer magazine, and the pages are full of companies that will sell you a DOS machine, expansion cards, operating system, etc. All these participants in the market are sinking money into the platform to try to establish a competitive advantage by making something better. (The Compaq Deskpro 386 and various video cards come to mind immediately as places where the market rapidly outran the original developers of the platform.)

But if you want an Amiga, you're stuck with Commodore. If you want an ST, you're stuck with Atari. Two individual companies both trying to build an entire _platform_ - custom hardware and OS - that's competitive with the collective output of an entire segment of the industry. I guess it's easy to say this now, but both of those companies picked a hard battle to fight - and both did very well, all things considered.

> ... single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?

The development wasn't the technology, the development was getting that technology into a delivery channel where it was relevant to a large group of people. This is why Windows 3.0 was such a big deal - there wasn't anything technically novel in it, but it was cheap, worked with what the customer base already owned, and attracted enough investment to overcome its (many) faults.


This reminds me of Rob Pike's "Systems Software Research is Irrelevant" talk [1]. Now, 20 years after that talk, we are still stuck with the same notions (such as everything being a string). It's not that there aren't plenty of alternatives around; rather, expectations are so high that it's almost impossible to make a new computer system economically viable. On the other hand, the hacker and maker scene is very active, with some of them building operating systems and hardware such as tiny Lisp-based machines [2] and OSes [3]. (My only gripe is that most of the new "avant-garde" systems are still text/file-based.)

I'd love to see a next wave in personal computing, starting with a clean slate, building on the research, insights and learning from the mistakes that have been made. I have no doubt that it will happen, the question is only when.

As for interoperability: Even on the same platform there are countless problems getting software to talk to each other, so I don't think that a new system will make the situation any worse.

[1] http://www.herpolhode.com/rob/utah2000.pdf

[2] https://www.tindie.com/products/lutherjohnson/makerlisp-ez80...

[3] https://mezzanos.herokuapp.com/


Good enough, cheap, and with source code available wins in the long term. Also, worse is better. But even "worse" was a lot better than DOS.


IMO Unix hasn't taken over as much as people think it has. If you look at OSes more closely, they typically have a POSIX API layer on top of whatever unique ideas they have. Even Linux does quite a lot of its own thing.


People don't know, or underestimate, how much Unix has taken over: from cellphones to supercomputers, from consumer devices to industrial control. Unix servers made and continue to run the internet, a.k.a. everything.


This! Also, we now take many Unix abstractions so much for granted that when they occur in different places we assume it's just convergent evolution.


POSIX more than anything; many of those IoT OSes aren't anything UNIX-related, although they do support some form of POSIX.


Most of the "new" Linux ideas can already be found in the mainframe world.


I think this is largely a factor of PCs getting more powerful and the lag until libre implementations appear. Trying to build container infrastructure during the Pentium 90 days wouldn't have succeeded.

See "The Innovator's Dilemma" for the process.





