
> There are basically only two “real” reasons to use Docker or containerization more generally:

> 1. People who do not know how to use Unix-based operating systems or specifically GNU/Linux.

> 2. People who are deploying a program for a corporation at a massive enterprise scale, don’t care about customizability and need some kind of guarantor of homogeneity.

Unix is only around because of its use at massive enterprise scale. Very few people were using Unix instead of DOS (or Mac OS or Windows or whatever) for their home PCs; it only became popular, and people only learned to use it (and later Linux), because of its use in business. Nowadays, Docker is the standard packaging system at massive enterprise scale. As such, you should learn to use it.






> Very few people were using Unix instead of DOS (or Mac OS or Windows or whatever) for their home PCs; it only got popular and people learned how to use it and later Linux because of its use in business

I would say this part is correct.

Your first statement is incorrect as phrased, but I understand what you meant. Granted, you would have to wipe out all the cloud providers running flavors of Unix, plus most phones and Macs, to really shrink its footprint. That being said, it's unpopular as a desktop OS. Phones and Macs hide it so well that most people are unaware of the underlying OS.

My first Linux machine was on my work desk in 1998, while we were running racks of UltraSPARCs in production.

I use Docker extensively for local development in all my projects at home and at work. This guy is wrong about multiple things, e.g. "Well, if you’re expecting Docker to have a file-system easily accessible, you’re wrong"

I can access my container's OS with: docker exec -it containername bash (assuming it has bash).

If the container OS has autocompletion and the other usual GNU tools and features, you get all of that functionality. If you want to rebuild the image, or even upgrade the one you have (most containers have access to a package manager), you end up with a new image you can use however you like... which might include running more than one service in the same container. It's just like running a script on another Unix machine, except without having to set up the physical networking or pay for a host.
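To make the filesystem point concrete, here is a minimal sketch of a few ways to get at a container's files; the container name, image name and paths below are just placeholders:

  # open a shell inside a running container (assuming it ships bash)
  docker exec -it containername bash

  # copy a file out of (or into) the container without entering it
  docker cp containername:/etc/os-release ./os-release

  # snapshot the modified container as a new image you can reuse
  docker commit containername myimage:patched

  # or bind-mount a host directory so edits show up on both sides
  docker run -it -v "$PWD/src:/app/src" myimage:patched bash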

It's very UNIX-y to provide single entry points to services and run them in relative isolation (changes to one container do not affect the others) by default.


macOS (since version 10) is Unix. You could say that most macOS users never touch the terminal, or that back in the 1980s and 1990s none of the popular desktop OSes were based on Unix, and that would probably be more accurate.

The massive enterprise scale part is more complicated.

First of all, we need to clarify that the "people who should know how to use Unix" here are developers and system administrators. Most people don't need to know Unix, and that's fine. You sometimes see people (I get the feeling the OP might be lowkey one of them) insisting that everyone should be running Linux and doing everything through the terminal. This is like saying everyone should be driving manual transmission, baking their own bread, growing vegetables in their backyard, building their own computer from parts, sewing their own clothes... you get the idea. All of these things can be cool and rewarding, but we lack the time and resources to become proficient at everything. GUIs are good for most people.

Now, the deal with developers using Unix is a much more complex story. Back in the 1970s Unix wasn't very enterprise-y at all, but it gained traction in universities and research labs and started spreading into the business world. Even well into the 1980s, the "real" enterprise was IBM mainframes, with Unix still being somewhat of a rebel, but it was clearly the dominant OS for minicomputers, which were later replaced by (microcomputer-sized but far more expensive) servers and workstations. There were other competitors, such as Lisp Machines and BeOS, but nothing ever came close to displacing Unix.

Back in the 1980s, people were not using Unix on their home computers because their home computers were _just not powerful enough_ to run Unix. Developers who had the money to spare certainly did prefer an expensive Unix workstation. So it made sense that large (for the time) microcomputer software vendors often used Unix workstations to develop software that was later run on cheaper microcomputer OSes. Microsoft famously used its own version of Unix (Xenix) during the 1980s as its main development platform.

This shows the enterprise made a great contribution to popularizing Unix. Back in the 1980s and 1990s there were a few disgruntled users[1] who saw the competition dying before their eyes and had to switch to the dominant Unix monoculture (if by "monoculture" you mean a nation going through a 100-sided, 20-front post-apocalyptic civil war). But nobody complained about having to ditch DOS for an expensive Unix workstation, except, perhaps, that their choice of games got a lot slimmer.

This is all great and nice, but back in the 1990s most enterprise development moved back to Windows. Or maybe it's more precise to say that the industry grew larger and new developers were using Windows (with the occasional Windows command prompt), since it was cheap and good enough. Windows was very much entrenched in the enterprise, as was Unix, but their spheres of market dominance were different. There were two major battlegrounds where Windows was gaining traction: medium-sized servers and workstations. Eventually Windows almost entirely lost the servers but decisively won the workstations (only to lose half of them to Apple later on). The interesting part is that Windows was slowly winning over the enterprise versions of Unix, but eventually lost to the open-source Linux.

Looking at this, I think the explanation that Unix won over DOS/the Windows command prompt/PowerShell (or Mac OS 9, if we want to be criminally anachronistic) simply because of its enterprise dominance is waaaay too simplistic. Sure, Unix's enterprise dominance killed Lisp Machines and didn't leave any breathing space for BeOS, but that's not the claim. DOS was never a real competitor to Unix, and when it comes to newer versions of Windows, they were probably the dominant development platform for a while.

I think Unix won over pure Windows-based flows (whether with GUI or supplemented by windows command-line and even PowerShell) because of these things:

1. It was the dominant server OS (except for a short period when Windows servers managed to capture a sizable chunk of the market), so you needed to know Unix if you wrote server-side code, and it was useful to run Unix locally.

2. Unix tools were generally more reliable. Back in the 1990s and 2000s, Windows did have some powerful GUI tools, but GUI tools suffered when it came to reproducibility, knowledge transfer and productivity. It's a bit counterintuitive, but quite obvious if you think about it: having to locate some feature in a deeply nested menu or settings dialog and turn it on is more complex than just adding a command-line flag or setting an environment variable.

3. Unix tools are more composable. The story of small tools doing one thing well and piping output between them is well known, but it's not just that. For instance, compare Apache httpd, which had a textual config file format, to IIS on Windows, which had a proprietary configuration database that often got corrupted. This meant that third-party tool integration, version control, automation and configuration review were all simpler on Apache httpd (see the small sketch after this list). This is just one example, but it applies to the vast majority of Windows tools back then. Windows tools were islands built on shaky foundations, while Unix tools were reliable mountain fortresses. They were often rough around the edges, but they turned out to be better suited for the job.

4. Unix was always dominant in teaching computer science. Almost all universities taught Unix classes and very few taught Windows. Students were often writing their code on Windows and later uploading it to a Unix server to compile (and dealing with all those pesky line endings that were suddenly all wrong). But they did have to familiarize themselves with Unix.
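As a concrete illustration of point 3: because httpd's configuration is plain text, the generic Unix toolchain applies to it directly. This is only a sketch; the paths and directive names are illustrative, not taken from anything above:

  # find every virtual host defined anywhere in the config tree
  grep -rn "ServerName" /etc/apache2/

  # validate a config change before reloading the server
  apachectl configtest

  # review and track changes with ordinary version control
  git -C /etc/apache2 diff

Try doing any of that against a binary configuration store without tooling specific to that store.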

I think all of these factors (and probably a couple of others) brought about the popularization and standardization of Unix tools as the basis for software development in the late 2000s and early 2010s.

[1] See The UNIX-HATERS Handbook: https://web.mit.edu/~simsong/www/ugh.pdf



