There's also an argument to be made about complexity. While the average home user in the past could, with some effort, learn the ins and outs of a DOS system, there is no way the average home user today has the time, energy, or experience to learn the ins and outs of how Windows 10 or 11 even begins to work.
The author briefly mentions that Arch Linux users are making an informed choice. But he fails to mention how few users, even amongst the hardcore Linux crowd, use a base Arch build with no pre-built configuration. Most users in the Arch community run Arch-based distros like Manjaro.
Getting a base Arch install running is fairly hardcore: you boot into an environment that has minimal drivers and is, for all intents and purposes, not functional. You then spend the next several hours configuring it to get a basic, kinda usable system, followed by weeks or months of tweaking to get it to a place where you might be happy to use it daily. This is because modern computing systems have hundreds or thousands of components that need to work together; most environments hide the integration of all of these parts from the user, because even the most sophisticated and educated users find it tedious to integrate all of that manually.
And Arch is considered baby steps compared to, say, Gentoo or even Linux From Scratch, which don't even try to hide operating-system-level details from the user the way Arch does. Play with either of those environments and you will quickly learn that modern general-purpose operating systems are orders of magnitude more complicated than those of 50 years ago. That's why they require teams of professionals to tune properly.
There are whole companies in enterprise computing (Red Hat, SUSE, Wind River) whose entire business revolves around helping customers tune their OS for production loads. This is entirely because OSes have become so overwhelmingly complex that most companies can't afford to keep the expertise in-house.
Now, there's an argument that maybe all of this complexity isn't necessary. But efforts to cut down on complexity have led us down the path of specialised computing, largely powered by hardware acceleration using FPGAs and ASICs, which is an entirely different beast from software.
To summarise the rant: in pursuit of performance and quality, we've made computing enormously complicated, with massive barriers to entry. It would not be feasible for users to have to learn the nitty-gritty details of computing before being able to use their computers.
Yeah, when I learned to program, you needed to know a few things:
- How to output text to the screen (and maybe to a file or serial port)
- How to read input from a keyboard (and maybe from a file or serial port)
- How to do basic conditionals, looping, and function calls (if, while, for, gosub/jump/return, etc.)
And that was about all you needed to make a computer do anything that a store-bought professional program could do.
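Just to make the size of that toolkit concrete, here's a minimal sketch of the whole list above in Python (standing in for the BASIC of the era; the greeting and the log.txt file name are purely illustrative):

```python
def greet(name):                      # function calls
    return "Hello, " + name

name = input("What is your name? ")   # read input from the keyboard
if name == "":                        # basic conditionals
    name = "stranger"

for _ in range(3):                    # looping
    print(greet(name))                # output text to the screen

with open("log.txt", "w") as f:       # ...and maybe to a file
    f.write(greet(name) + "\n")
```

That's output, input, conditionals, loops, and functions in a dozen lines, and it was genuinely enough to build useful programs.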
Languages, even BASIC, made those things pretty simple. But learning the OS interrupts and invoking them directly in assembly wasn't that hard either.
Now you first have to learn 75 layers of abstraction and API plumbing just to properly instantiate a window, take focus, and so on, never mind the language and the actual domain logic. You can do a whole lot more, and do complex things a lot more simply, once you learn all that. But you can't just do the simple things simply. It's no longer an option.
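For comparison, here's a hedged sketch of roughly the smallest "hello world" that owns a window, using Python's built-in tkinter (about as thin as GUI toolkits get). Even here you're already dealing with a widget hierarchy, geometry management, window focus, event bindings, and an event loop that owns your program's control flow:

```python
import tkinter as tk

root = tk.Tk()                         # create the top-level window
root.title("Hello")                    # window-manager metadata
root.geometry("320x200")               # ask the window manager for a size

label = tk.Label(root, text="Hello, world")
label.pack(padx=20, pady=20)           # geometry management, not "print"

root.lift()                            # raise the window...
root.focus_force()                     # ...and take keyboard focus

def on_key(event):                     # everything is now event-driven
    print("got key:", event.keysym)

root.bind("<Key>", on_key)
root.mainloop()                        # hand control flow to the event loop
```

And that's the friendly version; the raw platform APIs underneath are far noisier.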
So of course there's no learning path that takes that option anymore. And since people can't learn to do simple things simply, they never get that same feeling or go any deeper. It's just a black box that sometimes (usually) lets you do things, but you don't have control over it.
There's also the problem that what counts as 'simple' has changed over time.
50 years ago, simple was printing something to console.
Nowadays, simple is drawing a graphic in a window.
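To put that in code: the modern 'simple', drawing something in a window, already looks like this (again a tkinter sketch; the shapes and coordinates are arbitrary):

```python
import tkinter as tk

root = tk.Tk()
root.title("First graphic")

canvas = tk.Canvas(root, width=300, height=200, bg="white")
canvas.pack()

canvas.create_rectangle(20, 20, 120, 100, fill="blue")   # a filled box
canvas.create_oval(150, 60, 260, 170, fill="red")        # a filled circle

root.mainloop()                        # still need the event loop just to see it
```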
If you give a child or teenager a terminal and show them how to print some text to the console, they will not feel the magic. In fact, in my experience, they often miss the significance altogether and immediately ask how they can turn that skill into building an app or creating a game.
So learning to write software has to be reduced to small academic problems first, much like learning maths: you start with arithmetic and work your way up, and eventually you find applications for it.
Yeah, specialized computing leading to FPGAs and ASICs is quite an interesting path. I've always wanted to compile my code into a processor instruction/co-processor/FPGA thingamajig to cut down on all the bloat at once; the MiSTer is high on my to-buy list.