Hacker News
The Home Computer Generation (datagubbe.se)
170 points by stargrave on July 19, 2022 | 133 comments


I'm firmly in the "home computer generation" and it's becoming more frustrating than ever. It's frustrating that everything is going/has gone mobile, because mobile sucks. It doesn't matter what form factor or how fast the phone is, it's just easier to get up, walk over to a computer, and do the same thing 100x faster with less hassle. (Obviously, if you aren't near a computer or use specific apps, this isn't always the case.)

The problem with how accessible "computing" is these days, which really means tablet and mobile, is that now everything has to be mobile first, which makes everything crappier and less configurable. It gives less power to the power user, and we're the only ones who can see that loss. The market is so big now, with a few billion extra people regularly "computing", that of course every company will just keep digging down the rabbit hole toward crappier and crappier experiences tailored for mobile-only folks, and no one will notice but us.

I have a business that sells custom items with a live preview. It fits and works wonderfully on a computer. You can see the preview, the inputs, the help boxes, the dropdowns, everything. But now the entire world surfs on phones, where the experience can't help but be gimped by the size of the screen, the touch interface, and, half the time, embedded browsers. It sucks because people struggle through a process that would be 100x easier and faster if they just opened it on a goddamn computer, but they'll spend 45 minutes fighting through it on a phone. Maybe they don't know any better, but I don't ever see a future where they will, and that's sad and depressing. Technology is just getting worse. I can't even get an SD card slot in a phone any more, and people don't care or understand why that's insane.


Part of the reason for the push for mobile is that a big chunk of the tech industry nowadays competes with power users - if you're a power user you may not need their "solution", because you'll defeat their attempt at rent-seeking by using your existing, powerful tool (the general-purpose computer) that is built to serve you rather than them. Worse, a power user sharing their solution with others would also obviate the need for the rent-seeker's solution. A power user is absolutely not something a typical "growth & engagement" startup wants, because they'll be a troublemaker, constantly poking holes in their bullshit business model.

Essentially, we had a gap where computing devices (whether mobile or desktop) became mainstream and it was just a matter of time before laymen would eventually learn how to use them to their advantage - except the VCs beat them to it and started weaponizing the devices, not just to exploit laymen's inability to use them, but eventually to reshape the devices in a way that prevents them from being used for the user's benefit even if the user somehow gained the knowledge to try.

In the old times you could pay someone a one-off fee for them to teach you fishing and then fish on your own. Nowadays, standard fishing rods are being pushed out of the market (along with the knowledge of using them), being replaced by fishing-rods-as-a-service that rely on your perpetual lack of fishing knowledge to seek rent since they now have a monopoly on fishing (and everyone's lack of fishing knowledge means a competitor making non-DRM'd fishing rods would have a very hard time starting up).


It's so true. Most apps on mobile are just crappy, partial versions of tools you already have on a PC, but worse to use and with a subscription. It always feels like technology sucks nowadays and is just getting worse. It is no longer meant to serve us, but someone else who will abuse us for money.

I can't rely on tech or apps to do what I want, how I want it, with any privacy or reliability. It's just no longer the future we all dreamed about. More traps than tools. They sell a fantasy that you can talk to the air to turn a light on, or ask a simple question I could type in two seconds and get a faster, more reliable answer - whoop-dee-doo, I only had to pay for a device that spies on me 24/7 to exploit me.

They market a fake dream that makes our life look easier, cooler and more convenient, but it's definitely quite the facade once you take a step back to look at it.


> It is no longer meant to serve us, but someone else who will abuse us for money.

As predicted by Richard Stallman. People hate him because of the conclusions he draws from his principles, even though they turn out to be right every single time.


Nah. It's because people look at the conclusions he draws - that FOSS will benefit users, devs and the world with no downsides - then look at the reality of the shit show of a world built on FOSS that we actually live in, and conclude that the organisations who benefit the most are Microsoft, Google, Amazon etc., and the people who benefit the most are billionaire rent-seekers and the people who work, or who want to work, for them. Based on this they conclude that he's deluded. The answer to this is not more FOSS, license tweaks or moving off GitHub, but less giving away your time and money.


You are taking so much for granted in this comment. World built on FOSS? You should be thanking Stallman on bended knee for making that possible at all. The rent-seeking situation before FOSS was infinitely worse. Today, you can spin up a cluster of servers with state of the art load balancing, databases, checkpointed filesystems, and a standardized OS without paying a cent to anyone in software licensing fees. Do you even realize the magnitude of this achievement?

Sure, billionaires benefit - so do the rest of us. A rising tide lifts all boats. Don't blame FOSS for the evils of capitalist rent-seeking; the rent-seekers of old have been toppled.


I haven't read his stuff in depth, but isn't Stallman for copyleft specifically, not all FOSS? And hasn't he mellowed his copyleft stance because not enough people are behind it, hence a need to compromise?

Because in a world where you're required by law to open source your code under the same copyleft license or stronger whenever you build on copyleft projects, that rent-seeking would not be possible - but neither would making money from software, and the majority of the folks here and in the tech industry really don't want that.

I really wish we had that world - software would be a single community project - but less than a tenth of the people in the industry today would exist (and there would be no 'industry').


> Part of the reason for the push for mobile is that a big chunk of the tech industry nowadays competes with power users

It really doesn't. It's playing in a different league - tailoring its solutions to the mass market of people who have zero time for that power-user level of technical sophistication. It could market power-user-friendly versions of its products and services, but it's way too busy pushing them at uber-inflated prices in the "enterprise" market - where power-user friendliness is just one more box to tick in the hope of making a sale.


While I think smartphones have their (limited) uses, most of them don't appeal to me for one simple reason:

Why would I read anything on a cashier's receipt/slip if I don't have to?


What about knowledge work? Employees in tech, finance, and other desk-job industries aren't performing their jobs using an iPad or phone.

Wouldn't the tech industry have an incentive to also encourage a continuous supply of skilled labour that can use a pc effectively?


> What about knowledge work? Employees in tech, finance, and other desk-job industries aren't performing their jobs using an iPad or phone.

You might be surprised. I have several coworkers who are doing Olympic gymnastics to get away with using iPads as primary devices. There's only a few of them right now but they are multiplying rapidly.

> Wouldn't the tech industry have an incentive to also encourage a continuous supply of skilled labour that can use a pc effectively?

Yes, but that sounds like a problem for the Zoomers and Alpha. That sounds dismissive but I'm not sure we could solve it if we wanted to. It's too far away. By the time it becomes a problem maybe everyone can work on tablets, or phones.

I maintain that the current model of "software development" is more like "software manufacturing" and will be automated away in the future to the point we wouldn't recognize it today. In 50 years software professionals will look back on how we work today and wonder how we got anything done. Not too different from the way we look back on the 1970s today. How the heck does a slide rule work anyway? Can you imagine working with punch cards?


> Wouldn't the tech industry have an incentive to also encourage a continuous supply of skilled labour that can use a pc effectively?

Given the shit-show that Windows 10 is, it doesn't appear to be the case.


The mobile first trend is being pushed in large part by millions of people from developing nations coming online who have no access to a PC or PC like device.


I think that is probably a big part, but a bigger part is that 90% of people in first-world countries (who have money to spend and are valuable to advertisers) never knew how to use a computer (too complicated/scary), and now they all have one in their pockets that they're addicted to. And since they never used computers, they aren't as immune to being scammed, ripped off and fooled with fake dreams as those of us who lived through it all for years.


What makes your conspiracy theory about addicted developed-nations computer users more likely than the fact that strictly more human beings live outside of developed countries than within them? Are there any priors here? Is this a question of revenue per user? Is it a matter of cultural affinity? Is there any reason at all to discount the theory of more humans in developing nations?


A typical software development shop is still going to have more options for profitability primarily aiming at first-world markets. And it's not even just because the customers have more money - regulatory challenges and difficulties navigating political realities in third-world countries make it hard to target new apps at such markets. My previous company had exactly that challenge - we focused not only on mobile-only but also on older/lower-spec devices, hoping to break into developing markets. We never succeeded and ended up releasing exclusively in Aus and the UK, where many customers would actively prefer to use desktop-based interfaces.


Many developing markets don't necessarily have devices that are much older or lower spec. Look at Malaysia, Singapore, India, China, Taiwan etc. Many people upwards of middle class in these places have good access to reasonable devices.

Rather, they have different user behaviours and different needs. Success in these markets is about responding to customer needs locally, and not presuming that conventions, ideas, and designs that make sense in rich western countries make sense in those markets.

I've seen some good arguments that GUI layouts should be completely reexamined in developing markets, because the realities of differing languages (top-to-bottom, or right-to-left reading), or differing conventions in the presentation of information (extremely dense, or extremely sparse relative to what we are used to), mean that many app GUIs are completely counterintuitive and difficult to use in these markets.

Uber, for example, failed in India and Singapore because they couldn't adapt to the local markets' need to pay in cash. Even though credit cards are available, many customers prefer to use cash. It was such a strong preference that they would rather not use Uber than pay for their taxis with credit cards.


We never even got past the regulatory hurdles and commercial agreements needed to launch at all. Btw, I would hardly consider Singapore or Taiwan developing markets - we were targeting a few East African nations and had pretty good stats on which devices were common (hint: definitely not the latest iPhones!).


Because mobile gadgetry started in the so-called "first world", exposing the people there to marketing for a longer time.


That, but it's also pushed by superior availability. I don't know how many times I've lain on my couch doing some information work on my phone instead of taking out my laptop and doing it faster there. Often the laptop is right next to me, but it feels easier to pull out my phone than to open my laptop. I think it's mostly because I know my phone is always with me. If I want to use my laptop, there's a possibility I have to physically move to get to it. So it's less work to go for my phone directly.

This all made me realize how hooked I am on my phone. I take it literally everywhere I go, even inside my house.


The pain and frustration from using a phone is enough to beat my laziness and actually make me just walk to a computer for most anything apart from mindless scrolling/consuming. Looking for food and need to use actual websites or order from one? Just get up and save yourself the frustration.


> The problem with how accessible "computing" is these days, which really means tablet and mobile

Tablets and smartphones don't count as computing. They are tools for viewing, curation and maybe some lightweight creation, but they are appliances. They do make whatever they do more "accessible", but whatever they do is no longer computing.


> It's frustrating that everything is going/has gone mobile, because mobile sucks. ... The problem with how accessible "computing" is these days, which really means tablet and mobile, is that now everything has to be mobile first, which makes everything crappier and less configurable.

Yes, I agree that is the problem. Now both mobile and desktop computers are being made "crappier and less configurable": computers that you cannot program, that take a long time to start, that try to work against you, etc. That is why we must make better software (especially a better web browser than the bad stuff that exists now) that isn't bad like that. That is why I do it too, and not only me; hopefully others will do so as well.

Furthermore, I also hate touch screens.

> Technology is just getting worse.

Unfortunately, it is getting worse. Not only the programs but also the standards, specifications, etc. They don't design computers for advanced users, but they should. They add too many animations, DRM, telemetry, advertisements, and other stuff. The software is slow because it is badly written (but it is possible to make a faster program in C). Things that might be useful to display, if they fit on the screen, are not actually displayed; instead the display is filled with worthless stuff. They always force a GUI, too-big icons/fonts, etc. Things are not properly tested. They make the computer act as if it knows better than you, and you cannot enter any good command. They try to stop you from operating your computer properly. They will not write good documentation. You have to use their badly designed software for the TV service and many other things. They design specifications badly and insist that computers use them. HDMI is bad. USB is bad. UEFI is bad. Unicode is bad (no single character encoding will work for everything, and yet they insist that it will). HTML+CSS+JS is often bad (and they try to use it for everything, which is inappropriate). They still insist on using it, though. Things are patented, and that makes it worse, too.

I think that good design means the end user has full control and programmability; it doesn't add too many animations and other stuff like that, it comes with comprehensive documentation, and it doesn't assume that the user is wrong. Do not assume that the user interface is understandable, because without documentation it will not be understandable. Do not make functions require a lot of menus to access; you can assign a key combination. I think good design follows the UNIX philosophy: you have enough rope to hang yourself, and also a little more just in case.

One quotation from Steven J. Searle: "The sad fact of the matter is that people play politics with standards to gain commercial advantage, and the result is that end users suffer the consequences. This is the case with character encoding for computer systems, and it is even more the case with HDTV."


>I think that good design means the end user has full control and programmability; it doesn't add too many animations and other stuff like that

Approximately 100% of the population cares much more about having a nice UI than having full programmability.


But "approximately 100%" is not the same as really 100%. That is (one reason; not only reason) why alternative programs must be written and be made available, so that not everyone has to use the same (bad) one.


> ...more computer literate than the generations both preceding and, to the confusion of the aforementioned pundits, succeeding us.

This is the most terrifying thing to me. I spent my early adulthood watching all my leisure activity interests like sci-fi, videogames, superheroes, the web, etc go far more mainstream than I could've imagined without bringing along the behaviors they fostered. How can someone spend five hours a day online without caring how it works? It's like being a professional chef without knowing what a farm is. I expected the youth of today to be technological wizards, not a bunch of trained monkeys.


How many pieces of civil infrastructure, bridges, roads, walkways, gutters, buildings, etc do you use but not understand? Life is very complicated and most people only bother understanding the parts that they care about. It may be software for you, but it isn't for everyone.


> How many pieces of civil infrastructure, bridges, roads, walkways, gutters, buildings, etc do you use but not understand?

The question rather was:

> How can someone spend five hours a day online without caring how it works?

Thus: How many pieces of civil infrastructure, bridges, roads, walkways, gutters, buildings, etc. that you use would you love to understand if you had sufficient time for learning?

Answer: Nearly all of them.


Exactly. I'm not a civil engineer but I know a concrete bridge is composed of hardened slurry and that its shape transmits the weight of objects on it into its columns or to its anchored ends. The level of ignorance we're approaching is like kids not knowing why they only drive on one side of the road and someone saying "that's ok because the car won't let them cross over the center line".


> Answer: Nearly all of them.

For you maybe, but I don't think most people even care. Despite many Americans spending hours driving for their commutes, I don't think most even understand what goes into classifying the difference between an arterial, a state highway, and an Interstate. Despite many Tokyo residents using a myriad of trains to get around, most have only a dim understanding of how their rail system works.

But they can care about other things and that's fine. Maybe they're passionate about making art, writing books, or cooking food.


Before smartphones, people just sat five hours a day in front of a TV without caring how it works. Many older people still do.

People who are curious about this sort of stuff, and want to tear open black boxes (sometimes literally), are a minority, and have always been. And there’s nothing inherently wrong about that.


I'm an older millennial, aka Home Computer Generation. The overwhelming majority of my peers have and had no interest in how computers work, even though they grew up with desktop PCs. I had no problem paying for beer in college by fixing basic issues with personal computers. That expertise has never been widespread. It just peaked in concentration before the move to mobile.

Consider that the people pushing the mobile-first world are themselves the Home Computer Generation. The Boomers built the PC. Gen X and the Millennials are the ones who built the mobile-first world.


Millennials are NOT the "home computer generation", per the article. You've missed the point; the home computer generation were the ZX81/Atari people, who had to work out how the equipment worked.

The "mobile first" people are developers; honestly, users don't give a shit whether developers want to prioritise mobile. And developers aren't users; on the whole, developers understand the tech in ways that modern users don't have the first clue about. That's the same as it always was: the "home computer generation" wasn't a generation; it was a subculture. Most people knew nothing.

Now everyone has multiple computing devices, and they have no idea how they work. Back then, you HAD TO figure out how they worked, or they were useless.


A millennial is someone who came of age around 2000-2010. I did - I turned 18 in 1999 - and I grew up with home computers. Early millennials/late Gen X grew up with home micros; later millennials grew up with Windows 95+ PCs (which are plenty complex enough to force you to learn a lot about computers).

And yes, it's true not everyone in this generation learned anything about computers. Even in middle school in the nineties in a very developed country, I reckon less than half of my class had a computer at home.

But those who had, even outside the "subculture", learned a lot which people generally don't know today.


From TFA:

> Scores of people owned what was then called a home computer - those very special machines produced between the late 1970s and late 1990s aimed specifically at the consumer market: ZX Spectrum, C64, Amiga, Atari, and, overlapping with the business side of things, PC:s and Macs. A lot of us used our machines for little more than playing video games, but even in doing so, many of us developed a very special relationship to our machine in particular and digital technology in general. I believe this relationship to be unique to this particular period in time and that it resulted in a very particular kind of computer user.

> We love naming generations and ascribing vices and virtues to them. Boomers. Millenials. Zoomers. I propose that in the overlap between Generation X and Millenials, there is a home computer generation.

I was born in the first 20% of the Millennial generation according to Wikipedia. We didn't always have cable, and we didn't have a Nintendo until the late 90s, but we always had a desktop PC. The first PC I remember using was a 486 that my parents bought from my uncle, who was running a little custom computer shop named after a Clapton album. My first PC memory is playing Pharaoh's Tomb with my Dad and wondering what the incantations on the post-it really meant. I remember how excited Mom was when we got our first computer with a whole encyclopedia on CD-ROM and a hard drive big enough to "hold a library". I got my own PC in the mid 90s, a no-name Pentium. It was a hand-me-down from Dad's coworker; he made me promise I'd learn something with it. That was my first experience with Linux - Mandrake, as I recall. I wasn't old enough to drive. In high school I took a computer tech class, learned HTML, learned how to build PCs, helped wire the building for Ethernet, and then inherited the journalism lab for my junior and senior years. One of my first jobs was in a local computer shop. I worked the IT desk at my college. After graduation I did onsite tech support for Dell before starting my career in data engineering - a path that was suggested to me by one of my gaming buddies.


All humans are trained monkeys. Some of those monkeys are self-trained on technological wizardry.

Why are you comparing hobbies to professional cooking? I'm glad I can cook without having to farm, and I'd expect most people to be glad they can browse the web without expert knowledge in HTTP.


Yes, but you know what a farm is. You know that animals reproduce, that their infants grow by eating vegetable matter. I can't write HTML either, but I know that it's text which describes the content and layout of a web page. I have younger relatives who don't know what a file is. They don't comprehend that a document and a picture and a video and an audio track are all same sort of thing interpreted in a different way.


Most people's (in the US's) mental image of a farm is probably way different from the likely-very-industrialized farms that are actually producing most of their food.

Strikes me as fairly similar to the surface-level of understanding they have about "apps" and "operating systems" and "memory," actually.


Understanding the difference between a manual and a highly industrialised process doesn't matter here. It's about the underlying principles that define how things work.


> Why are you comparing hobbies to professional cooking?

I can't tell whether you really don't understand or if you're straw manning an analogy and pretending innocence; trolling.

It's called a simile. It doesn't require all aspects of the two things being compared to be equivalent.

The things are not being compared on the basis of the professional status of the "trained monkey," but on the understanding they have of the sources of the things they use.


The mobile- and social-centric Web is the new TV. Do you care how your TV works? Most people aren't using it for anything important, they've got no reason to care.


A neighbor of mine, a professor at an excellent private college who teaches a stats class where the students are required to do a bit of programming, told me the other day that she's noticed a cycle in her students over the past few years.

15 years ago the students tended to come in knowing more about navigating their computers than she did; now, and especially in the last 4-5 years, she's increasingly having to teach them basic computer literacy in order to get to the learning they're actually supposed to be doing.

This has contributed to my resolve not to let my son have an iPad, but he can have a laptop pretty much when he wants one.


I had a friend tell me recently that he had to teach a (high school) intern how to navigate the Windows taskbar and use the Control-C and Control-V shortcuts on the keyboard. I thought he was joking until he mentioned that the student mostly used mobile devices at home and school and rarely used a PC. Apparently schools nowadays are replacing their laptop fleets with iPads and all that.


Back in the day you would learn this stuff from an actual, printed manual. Proper documentation was a major part of what you got when buying a retail software product, even when bundled with OEM hardware. Where's the user's manual for modern OS's? Even the inbuilt "help" documentation has disappeared.


Prompted by your post I went looking for some, typing 'help' into the start menu search in Windows 11. An app came up called Get Help.

It opens with an ad banner at the top: "Increase productivity and collaboration all while staying organized, using a new meeting solution designed for small businesses. Learn More". Out of morbid curiosity I chose to Learn More but "We are sorry, the page you requested cannot be found."

There are several features in Get Help beyond the primary feature of linking to a broken URL. You can sign in, you can rate your experience out of five, or you can send feedback. There is no Contents or Index, but there is a search box with a few suggested searches like 'How to install Office'. I tried searching 'how to copy paste' (without quotes) but "Your search did not match any solutions." It suggested I sign in to contact support. Closing the app is a little tricky because the minimise/maximise/close buttons are the wrong colour - white on pale grey.

Before closing the app, I selected a rating of 1/5, although my rating had been cleared upon reopening.


> Prompted by your post I went looking for some, typing 'help' into the start menu search in Windows 11. An app came up called Get Help.

I just tried F1 in both Firefox and JetBrains Rider. Nothing. I wonder when that one disappeared. Though at least both have a proper online help.

edit: Trying it in all open apps, some kind of help opens in MS Teams, Discord, Outlook and Directory Opus (file manager).


The empty bloatedness of a stinking, rotten cadaver. And countless vultures fighting over the last pieces. Or trying to sell them.


I would struggle to write a satire of the situation worse than the reality.

In case things change, for future readers, here's a screenshot: https://i.imgur.com/aToMCtE.png


This sounds a lot like my experience with the Help menus in Adobe stuff, I haven’t been able to get to my expensive art tool’s menus from within it for years. I miss when it came in a box with a printed brick of a manual.


I haven't purchased a supposedly user-friendly mobile device in a long time but I don't really recall them having an initial tutorial or something along those lines. You just go through the setup and they toss you at the home screen. In contrast I think the old Androids (when smartphones were just coming out) had a tutorial teaching you how to swipe, pinch zoom, and so on. The assumption is that people know how to use this stuff "intuitively".

The big legacy software suites still have actual help pages and PDF manuals, while the mainstream stuff just has a FAQ on their web page.


MBA: "Tech writers are a cost center. Cut them and we get a fat bonus"


Documentation is sooo old-fashioned. Instant action for satisfaction!


I'm currently in the process of buying various old Commodore hardware (C64, A500, A1200) and investing in better tooling (just got an oscilloscope) for this reason.

My oldest son (8) and daughter (6) are both interested and at an age where I can introduce them to electronics and computing. When I was my son's age I was already building simple radio receivers (crystal + transistor) so they're old enough to start with these machines.

Laptops will just get used for web games and youtube anyhow.


Did you consider teaching them using something where the learning is more collaborative, where you are not an expert? Scratch, Roblox, etcetera.

Part of the joy of learning the home computer was the feeling of being on the leading edge.


+1, thanks for this anecdote. I'm thinking of coding the world's wittiest oldschool text-only adventure game for our son to make him more comfortable with keyboard input and the command line.

It'll probably also be good for developing creative use of written language (he likes paper books) - hopes I, who learned English and touch typing via Al Lowe's "Leisure Suit Larry" series.

That aside, I'm fine with him playing Minecraft, though. It's a nice fit between uber-realistic contemporary games and 80s computation; also, the Minecraft kids need to use their brain while playing, ha.

Maybe I'm wrong, but it feels like "lousy", rasterized graphics may be better for the child's imagination. It's closer to reading a book, where you still have to "draw" the world and characters in your head. But, also, I'm old, and fine with that.

You can't pull the kids out of the 21st century, but it makes a lot of sense to introduce them to the most valuable accomplishments of the 1980s and 1990s. It's a bit like showing them the simplicity of manual carpentry, chisels and hand planes in the age of electric tools.


Minecraft is teaching kids a lot about computers, especially if they get the chance to use the more hackable Java version. It's not usually as low level as we got to in the 80s, but there are exceptions to that too (I don't know exactly how old the guy who built a pipelined RISC processor in redstone is, but I suspect they're at least 15 years younger than me).


I've worked with engineering students in a programming class, they do seem to have lots of trouble with things like zipping files, finding things in the filesystem, finding things in drop-down menus, and installing programs. The overall computer literacy levels are definitely lower than I might have assumed...

On the other hand, when I was an undergrad classmates would struggle with many of the same things (although the widespread use of pip has enhanced new users' ability to screw up their computers at scale and give me headaches). And we've all got a funny story about the guy who messed up the group project's share folder, right? Whether it lived on a floppy disk, thumbdrive, or in the cloud.

I dunno. The specifics change, but human creativity when it comes to making mistakes is timeless.


It's bad.

As in, not everyone coming in understands the concept of the right-click.


Maybe that just means that programming tools need to evolve to meet the expectations of the post-home-computer generations.

I'm not sure it's a good idea to require future generations to do things the way we did, e.g. by using a laptop rather than an iPad. I'm not a parent as you are, but I am an uncle. When I was a child, my favorite uncle was a home computer hobbyist, and he taught me a lot of what he knew about programming, particularly in Applesoft BASIC on the Apple II family. When I first became an uncle, I was looking forward to doing for the next generation what my uncle did for me. But the relationship between him and me was a special thing that will not happen again. He started using home computers as an adult hobbyist at around the same time I was born, and my parents probably bought our Apple IIGS based at least in part on his recommendation. Now, there's a wide gap between the way I'm used to using computers and the way that the kids are using them. I think my nieces and nephew, if they have any interest at all in programming, will be better off learning it from someone other than me. And of course, there are so many online resources these days; they don't have to have an in-person teacher for this at all.


I'm rather an oddity here...I came to microcomputers as a young adult, rather than as a kid, but I consider myself very much of the "home computer generation," and being part of it led to my career. Because of the time my children were born, they were very much inheritors of this. My eldest son grew up sitting in my lap, watching the Norton disk optimizer clean up my hard disk. He and his brother build their own gaming machines, etc.

But reading this article, and some of the comments here, I wonder, isn't this what we were always working toward? Isn't it OK that we're reaching a point where computing devices are ubiquitous and nearly everyone can pick one up and make use of it? My kids are digital natives, but my wife didn't find computers useful until she got an iPad, and it's enriched her life greatly. She's not technical, never has been, and always avoided "complicated" computers.

I think we've merely passed into a different era of computing. For some people it's not as fun, perhaps not as lucrative, but for many more people it's (arguably) more empowering and useful.


I feel like the concern here is that we're quickly moving from an open-ended content and _capability_ creation era (e.g. I can program anything I want for my PC _WITH_ my PC) to an era of computing that is mere _content_ creation at best, and preferably _consumption_ only. Your iPad doesn't self-host any flavor of Xcode that you can build and run iPad programs with.

It seems, with a bit of cruel irony, that for a moment in history, due to the intersection of affordability and the limitations of computing power, the home computer gave anyone with the money access to the full scope and capability of a computer through "higher level" programming languages. They could not only use software from others to create content (images, video, audio, text, web sites) but also create _capability_ - programs that made it easy to do whatever the limitations of the machine would let them. But as the computer became more powerful, it became "easier" to hide the elements that allowed creating _capability_. We had it for a while; now it seems to be fading away from common view.

This is what scares me. We're no longer becoming true 'computer users'. We're becoming 'digital content consumers'. Companies are fine with that. I'm not.


I think the point many people are driving at is that the essence of what a computer is - the only machine (up to trivial differences) that can be arbitrarily programmed - is being obfuscated by devices that make it easy to consume and hard to produce, an asymmetry that doesn't need to exist.

A few months ago I read on HN this analogy, which I found very apt: it's like there's no space in the modern computing world for a computer-literate "middle class". You're either not interested in computers at all and you're fine with a tablet or a chromebook, or you need to know so much stuff about computers that you might as well make a career out of it.

Sent from my iPhone, how ironic.


There's a lot of space for traditional power users still, but they've become relatively invisible because the new mass-marketed stuff is so much more prevalent.


There's also an argument to be made about complexity. While the average home user in the past could, with effort, learn the ins and outs of a DOS system, there is no way the average home user would have the time, energy, or experience to learn the ins and outs of how Windows 10 or 11 even begins to work.

The author briefly mentions that Arch Linux users are making an informed choice. But he fails to mention how few users, even amongst the hardcore Linux crowd, use a base Arch build with no pre-built configuration. Most Linux users in the Arch community use Arch-based distros like Manjaro.

Getting a base Arch install running is fairly hardcore: you boot into an environment that has minimal drivers and is, for all intents and purposes, not functional. You then spend the next many hours configuring it to get a basic, kinda usable system, followed by many weeks and months of tweaking to get it to a place where you might be happy to use it on a daily basis. This is because modern computing systems have hundreds or thousands of components that need to work together; most environments hide the integration of all of these parts from the user, because even the most sophisticated and educated users find it tedious to integrate all of that manually.

And Arch is considered baby steps compared to, say, Gentoo or even Linux From Scratch, which don't even try to hide operating-system-level details from the user the way Arch does. If you play with either of those environments, you will quickly learn that modern general-purpose operating systems are many orders of magnitude more complicated than those of 50 years ago. That's why they require teams of professionals to tune properly.

There are whole businesses in enterprise computing (Red Hat, SUSE, Wind River) whose entire business revolves around helping customers tune their OS for production loads. This is entirely because OSes have become so overwhelmingly complex that most companies can't afford to keep in-house expertise.

Now, there's an argument that maybe all of this complexity isn't necessary. But efforts to cut down on complexity have led us down the path of specialised computing, largely powered by hardware acceleration using FPGAs and ASICs, which is an entirely different beast to software.

To summarise the rant: in pursuit of performance and quality, we've made computing enormously complicated, with massive barriers to entry. It would not be feasible for users to have to learn the nitty-gritty details of computing before being able to use their computers.


Yeah, when I learned to program, you needed to know a few things:

- How to output text to the screen (and maybe to a file or serial port)

- How to read input from a keyboard (and maybe from a file or serial port)

- How to do basic conditionals, looping, and function calls (if, while, for, gosub/jump/return, etc.)

And that was about all you needed to make a computer do anything that a store-bought professional program could do.

Languages, even BASIC, made those things pretty simple. But also learning the OS interrupts and doing them directly in ASM wasn't that hard.

Now you first have to learn 75 layers of abstraction and API stuff just to properly instantiate windows, take focus, etc., never mind the language and the actual domain logic. You can do a whole lot more, and do complex things a lot more simply, once you learn all that. But you can't just do the simple things simply. It's no longer an option.

So of course there's no learning path that takes that option anymore. And since people can't learn to do simple things simply, they never get that same feeling or learn deeper. It's just a black box that sometimes/usually lets you do things. But you don't have control over it.


There's also the problem of what is 'simple' has changed over time.

50 years ago, simple was printing something to console.

Nowadays, simple is drawing a graphic in a window.

If you give a child or teenager a terminal and show them how to print some text to the console, they will not feel the magic. In fact, in my experience, they often miss the significance altogether and immediately ask me how they can turn that skill into building an app or creating a game.

So learning to write software has to be reduced to small academic problems first. Like when you start learning maths: you start with arithmetic and work your way up, and eventually you find applications for it.


Yeah, specialized computing leading to FPGAs and ASICs is quite an interesting path. I've always wanted to compile my code into a processor instruction/co-processor/FPGA thingamajig to cut down on all the bloat at once; the MiSTer is high on my to-buy list.

Imagine writing Haskell and having it become literal hardware, like https://github.com/clash-lang/clash-compiler


The flip-side of this decay in the richness of the experience is that things are more idiot-proofed for mass consumption.

I believe that's the driving factor behind grandparents saying "kids these days are so good with the computers" etc. Older generations grew up with machines that you could actually ruin unless you read the manual, leading to hesitancy and trepidation with the new stuff. (Which often eschews documentation entirely.)

In contrast, younger generations are more likely to assume (often correctly) that they are free to try randomly poking icons and twiddling dials until something looks promising. Seen from the outside--especially by that older generation--this confidence can be mistaken as expertise.

(This is similar to how some people will consider you a magician if you open up a command-line prompt.)


> The flip-side of this decay in the richness of the experience is that things are more idiot-proofed for mass consumption.

I liked the internet a lot more when the idiots couldn't use it.


That is a disturbingly elitist attitude. For some of these so-called "idiots", the Internet is their only way of connecting with communities that may not exist locally. Think, for instance, of a person who went blind late in life, who can connect with other such people even though they're in the middle of nowhere, thanks to the Internet. We shouldn't require such people to master arcane computer stuff as well. As someone who has developed software catering to this exact group of people, I have indeed been annoyed at their lack of proficiency sometimes, but I'm glad I could help them get connected.


A disabled person is not an idiot, nor is someone who's had the misfortune of no exposure to computers. An idiot is someone who thinks they're entitled to wield immense communicative power without the slightest effort or responsibility. It's every bit as easy to use the internet to destroy your life now as it was twenty years ago, perhaps easier. Maybe having to put a little work into it makes you respect the paradigm shift just a tiny bit more.


But this isn't true. The selling point of the Commodore 64 for education, stated emphatically by Jim Butterfield, was that you didn't have to worry about breaking the machine. As soon as computers started coming with hard drives and without the OS burned into ROM, that was no longer true. It was quite easy to soft-brick a DOS PC by messing with autoexec.bat, and there were legitimate reasons to mess with autoexec.bat!

Conversely, some of the environments kids learn programming in today are very hard to brick. The micro:bit, for instance? I'm not sure it can be done, even though you can get very low-level access to it (even bypassing Arm's little RTOS is easy). And of course Java and JavaScript come with sandboxes. Using a computer is pretty damn unsafe, with all the scams and Trojans and pedophiles and whatnot, but playing around with it is back to being fairly safe.


You won't get very far poking randomly in a command prompt though - at least a few memorized spells are required to perform magic


I think this mostly isn't true? As in, if you're patient you'll be fine. Some iteration of "help" will likely get you going, and that's in the odd case that you don't have Google by your side.


You might be surprised how much time kids have on their hands. Typing "help" at the old-school DOS prompt gave some hints, IIRC. Somehow I found QBasic, maybe just by listing some directory, which started an endless rabbit hole of self-learning for me.


> Somehow I found QBasic, maybe just by listing some directory, which started an endless rabbit hole of self-learning for me.

The QBasic IDE was incredibly intuitive; it's sad that we don't have something closely modeled on it (or on Turbo Pascal/C++) today as a default terminal-friendly dev experience. Equally disappointing that the most common window-based IDE is some sort of Electron- and JavaScript-based monstrosity.


You used to get a manual to read.


Mostly false. It's true that they've done the bad thing of reducing the possibility space by turning towards "buttons" and away from "text" - but they have not really made things any safer.

Namely, you can still - perhaps even more easily - blow your computer up by clicking the wrong email. So no, not much idiot-proofing has actually occurred.


In most countries, grandparents probably didn't even have an opportunity to get near a computer when they were young. They didn't have the opportunity to ruin anything, because there was nothing around to ruin. But they will still repeat this trope of "kids these days are so good with computers".


Great-grandparents pretty soon. Early GenX (say late 1960's births) and younger grew up with Pong and VCRs and know how to "run the machine."


Being born in 1985 and having grown up with home computers, starting with the IBM PC jr and BASIC, through to a 386 with QBasic, and then a Pentium with Visual Basic, this essay resonates with me a lot.

It's not just the actual knowledge I picked up which has served me well, but also gaining a general intuition about how computers work. Probably most important of all though is learning how to learn. It's this skill that I am sure I will rely on most over time as my esoteric knowledge becomes more and more irrelevant.


Earlier versions of Windows and similar programs used to be far more discoverable. Self-documenting.

The menu system could be intuitively accessed by a mouse, without learning anything specific.

It was common for Alt + an underlined menu letter to open that menu directly. Underlined letters IN the menu items would quick-select an item from that menu. Items expected to be commonly used, which had direct keyboard shortcuts, showed the end user the shortcut.

The design was simple, categorized, and discoverable. Self-documenting end-user education.

Now we have ribbons... useless ribbons that make everything slower.


We've gained a lot, but also lost a lot, from going from mouse+keyboard to touch screen only. There's no more hover to discover, or an ALT key to hold to guide us as to what we can do and what does what. Gestures feel like magic spells that are great if you know them, but completely hidden if you don't. Not everything in the early days was obvious or intuitive, but at least we had more clues to guide us, and failing that, a manual to reference.

Microsoft, IBM, and others did huge amounts of user testing and research to get those early UI conventions you mention, and it served quite well. I lament that it has fallen out of favor.


I’ll add that for casual usage, the iOS ecosystem has been very enjoyable to use after I’ve gotten over the initial shock and have learned some tricks. Many tasks feel very natural and easy for every day usage. When I need to get some serious work done though I am much more productive with a keyboard and mouse. I have an iPad Mini 6 that I put into a case which gives me a usable keyboard and trackpad, and for me it has turned the iPad into a killer small portable device that gives me a nice blend of casual consumption device plus a real productivity device.


Honestly, the ribbon wasn't too bad. What was missing was letting the user know that Alt was available and usable. Where the guidelines are followed, as they are in Office and Explorer, pressing Alt puts the quick-access letter next to all the available items.

What has really killed discoverability are "hover to show" buttons and the hamburger menu. Both are attempts to save screen space and present a "clean", albeit abridged, view of what a user can do from an interface at a glance.


Yeah that intuition is invaluable. It's what makes us the family "computer guy/girl", and is why we can help someone who uses some piece of software every day, that we have never seen before, figure out how to do some thing they want to do.

I try to tell people this, when I help them - I don't know everything about every piece of software, I just have a sense of what the operation they want to do will probably be like, and the confidence to poke around until I find it because I also have a sense of what is likely to be a destructive operation.


Tech Support Cheat Sheet: https://xkcd.com/627/


As is often the case, the mouseover text on that XKCD just makes the comic!


I enjoyed the nostalgia of this piece, but in general I suspect that the proportion of the population who are predisposed to be "computer nerds" as we were often known in the 80s/90s, is pretty much the same.

While the systems themselves may not force those people to be confronted with the inner workings from the first flick of the power switch, the barrier of entry is so much lower now than it used to be, if you are interested in that stuff and want to try it. I was only able to buy an Amiga in 1990 because one of my grandmothers died and left me some money - £399 at the time, but adjusted for inflation that's £800. A suitably motivated person these days could choose to buy a Raspberry Pi for 5% of that and they'd just need a keyboard/mouse/tv which are easy to come by for little/no money.

I don't think we should expect everyone to be computer literate, and I welcome the era of appliance computing for the utility it provides (although I do grumble when my OSes/devices lose features in the name of simplicity), but I do think we should try to expose more young people to "real" computing, so the ones who are predisposed to love it, can get that opportunity. For that reason I'm buying each of my kids a Pi for their 10th birthday - they get two SD cards, one with Raspbian and one with RetroPie and it's up to them which they boot during their screen time (or neither, if they would prefer to just watch TV or play xbox). I was hooked from day one of having a computer in the house, but I don't need them to be hooked too.


I'm not sure about the barrier to entry. Sure, buying a computer is much cheaper, but the things you can do have been intentionally restricted - there's an entire industry thriving on artificial inefficiency, the sort of thing computers were meant to obviate.

Imagine a teenager wanting to search for & automatically "like" Facebook pages about their favorite band. Where would you even start, considering Facebook intentionally makes that difficult with fear-mongering about "security" and lays traps that will ban your account if they detect automated activity? Same with all the other mainstream services. It's much harder nowadays to make computers do anything "useful" when everything is intentionally siloed into proprietary apps & services that intentionally prevent interoperability.

> For that reason I'm buying each of my kids a Pi for their 10th birthday

I'd suggest getting them an older laptop with good Linux support - a dual-core/4GB RAM ThinkPad can be obtained for relatively cheap and has the advantage that it's still very similar to modern computers especially early on in the boot process (x86 as opposed to the Pi's ARM architecture) so any low-level skills they'd learn while breaking & fixing it would also apply to most computers out there.


> I suspect that the proportion of the population who are predisposed to be "computer nerds" as we were often known in the 80s/90s, is pretty much the same.

Possibly. But the fact that owning a PC essentially forced the exposure to happen undoubtedly caused more seeds to grow. It’s far too easy these days to never encounter that friction, and therefore people with latent skills simply go into something else (or worse, never find another passion).


Reminds me of the Purdue Boilermakers.

Purdue University has the Boilermaker as its mascot.

It always seems such a weird mascot for an engineering school.

However, when the school was founded, steam boilers were at the cutting edge of engineering. They were what powered railroads and steamships, and their high pressures and temperatures pushed the limits of metallurgy and reliability. Many people were killed when boilers exploded.

So at that time, "Boilermaker" suggested attention to detail, and broad technical knowledge.

Now not so much.

Every generation builds on the foundation laid before, and tries to achieve new things.


Building and maintaining boilers is still a highly regulated industry, with the fairly reasonable justification that they can blow up big if you screw up. I wonder if they still teach anything specific to that.


Everyone likes to consider themselves exceptional. For a long time, I used to say "oh, this passed me by because I had access to a mainframe (smirk)" but I now realise I was in the cohort, and just made other life choices.

The fact I had access to a mainframe had nothing to do with it. At best? it was post-hoc reasoning.

I certainly knew people who assembled their own from kits, one who made their own from 74xx series logic and discrete parts, many who bought. I didn't do any of these things but I felt comfortable that I could, and I understood many of the underlying principles because .. well if you have access to a mainframe, you "see" the same elements in a bigger form.

Access to a mainframe as a kid wasn't exactly normal, but if you count the number of mainframes in the UK in the 60s and 70s and multiply by 20-50 for the headcount who ran and used them, it's not zero. Across the same interval thousands of people came into ownership of a full-blown computer, and I didn't, mostly by choice. I got an Acorn "Atom" around the time the BBC Micro was launched, secondhand from somebody upgrading. By then, I was close to graduating with a CS degree. The most fun I had was building an external PSU from scavenged parts. The analogue bit!


> Being a digital native doesn't automatically mean you're computer savvy.

I don't have time to properly dive into this, but adding "origin of digital native" to my todo list.

I feel like digital native has always been about how people who grew up with certain devices use those devices differently than people who didn't grow up with them. People who grew up with desktops, might use mobile devices as if they were smaller desktops. People who grew up with mobile and use mobile as a primary device, might approach desktops as they would with mobile. Old habits die hard. I believed that digital native describes how someone might interact with interfaces, not about how they understand underlying tech. The term would also make more sense when applied to products. You want to know how different demographics use your products.


Growing up with home computers DID give you a different mindset about computers. Sooner or later you realize that you can make this machine do anything you want it to do. And it was quite simple to get started.

No way you'll get this same mindset with modern smartphones.


90% or more of computer users during the "Home Computer Generation" (which to me is anchored around 1985, give or take a couple of years, and Commodore 64) were no more computer savvy than smartphone users are now.

They wanted to play games. The computer had no OS to mess up. Loading up a game from disk did not require significant computer skills and if something went wrong, you just turned it off and back on and tried again.

Maybe a tiny little bit of dabbling in BASIC along the lines of the print-over-and-over loop or a type-in program from a magazine. But computer savvy? Hardly.

The main difference, as far as I am concerned: the <10% who were geeky enough to actually program these computers, or crack video games, or draw cool pictures using graphics software, whatever, who contributed something to the scene, had a ready mainstream audience. Lots of other people had the same machine and were eager for new stuff to run on it. And things were simple enough that you could write an interesting-to-neurotypicals video game in BASIC in one day (in my case, a cute little platform jumper game that I saw on someone's VIC-20; even though we had much better ones on the C64, it just had this odd, primitive charm - so I made my own version).

These days, the geeks do stuff that is largely only interesting to other geeks. Which is fine, the internet ensures that even this limited audience is huge. But the 90% that aren't geeks now aren't exposed to geek culture at all, not even as armchair dabblers or watchers. And use their magic pixel slates.


It's not this black and white. There's a whole spectrum of computer savviness among computer users of the "Home Computer Generation". I didn't learn programming during that time, but I learned all sorts of things about computers that led to me eventually having a career as a software engineer.

And I'm sure there were many who did create software/graphics/etc but didn't "contribute it to the scene".


Indeed, much of my current problem solving abilities were developed as a 10 year old trying to make DOS games work in the mid-to-late 90s. Most software dev and ops issues that I run into in my daily work are certainly no more difficult than the problems I had to deal with then, especially with all the info available online now.


Amazing read! It puts into words what I feel about the technology industry and why I hate it a little more with every passing day.

Big thanks to the author.


Computing just for the sake of it. I miss that. But at the same time I ask myself: why? It looks pointless to me now.

However, back then it didn’t look like that. It was amazing, it was fun, it was full of possibilities. Not anymore.

It’s true that things have changed. But we have changed as well. Maybe it’s more because we changed than because technology changed.


10 PRINT "FUCK"

20 GOTO 10

…still makes me giggle


Oh, in my intro IT classes I do a whole thing about

10 PRINT "John is AWESOME

20 GOTO 10

...and little 10-year-old me discovering how to BE GOD BY HARNESSING THE POWER OF INFINITY in your computer.


At around 12 years old (having learned some extremely basic Q-BASIC from a book, with no computer at home to test it on), I co-wrote a piece of scareware with a friend from school on their family computer. It would fake formatting your computer. We knew about GOTO, but not about loops (or faking them with GOTO), so we copied and pasted the same line 100 times and edited the number in every line :D Good times ;)
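For the curious, here is a minimal sketch of the GOTO-as-a-loop trick we were missing (the line numbers and messages here are invented purely for illustration); something along these lines would have replaced our 100 pasted lines:

10 REM fake "format" progress counter, loop faked with GOTO
20 LET P = 0
30 PRINT "FORMATTING DRIVE C:"; P; "% COMPLETE"
40 LET P = P + 1
50 IF P <= 100 THEN GOTO 30
60 PRINT "FORMAT COMPLETE. HAVE A NICE DAY."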


I got in trouble for doing this very thing in middle school on an Apple ][. :D


Brilliant! This explains sooo much about myself!

And there's sooo much more about this to write... about the letdown of requiring security in modern networked computers, the lack of trust! How this knowledge was built on IRL friendships... not just remote ones...


> The disappearance of OS and program configurability, for example, isn't something you notice or even think about if you're not coming from a place where everything once used to be configurable.

I once became determined to repurpose one of the serial port chips in a Mac so that it sent and received at the 31.25 kbps MIDI data rate. Doing that required a certain undocumented incantation with a magic value. I managed to deduce it thanks to an assembly program it took me a week to find in an old magazine. That value was sent to the chip from 'FreeBASIC' ... which could then, voilà, send MIDI bytes out the port without needing external hardware.

At the time I bitched a lot about that being made so hard to do. Little did I suspect.


Is this the same thing that happened to MMORPG design?

"The Home Computer Generation":"Computer Literacy"::"The EverQuest Generation":"Gaming Literacy"

Young gamers starting off in EverQuest had a hard time of it (just like the author states with getting things to work on computers), then we made everything easier, e.g. finding groups, minimaps, "pay-to-win" (as the author states with Steam), and now we have a generation of people who don't understand how things work...



What a whinge.

> Being a digital native doesn't automatically mean you're computer savvy.

What it actually means is that these generations operate at higher levels of abstraction. Even the "computer builders" of the past couple of decades merely snapped together lego bricks designed by others.

And anyway, in the "old days" you spent at least as much time looking after your system as actually using it (I was shocked how my PC friends had disk optimizers and anti-virus and whatnot). But back then it was part of the fun, as it is for hams.

But kids these days have too many important things to do than to become computer hams.


> What a whinge.

Strong disagree here.

>What it actually means is that these generations operate at higher levels of abstraction.

That may be true, but I don't think it's necessarily a good thing. In the analogy of a car driver, if you don't know how the car works aside from driving it, then you are worse off in many respects. If it behaves oddly, you have no idea what is wrong, no idea how to fix it, and (in my experience) are a lot more likely to pay significantly more to do so.

For many today, computers are a complete mystery - indeed I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid. They have become magical devices that seem beyond comprehension for many, and I think that's to the detriment of everyone.

>And anyway, in the "old days" you spent at least as much time looking after your system as actually using it

Also not true. Didn't spend -any- time looking after my ZX Spectrum. Just turned it on and either started writing software straight away, or loaded a game up. Maintenance was not a thing.

>But kids these days have too many important things to do

That has certainly not been my experience as a parent. They have lots of things to do, but I don't think much of it is important. It's just attention-grabbing.


By the same token we could consider watching TV just a higher level of abstraction than being an amateur radio enthusiast. It would be technically correct while completely absurd in the real world.


>> But kids these days have too many important things to do

> That has certainly not been my experience as a parent. They have lots of things to do, but I don't think much of it is important. It's just attention-grabbing.

I think those "important things to do" means "extracurricular activities", that probably isn't so important to you or to me, but we know other parent that doesn't let their children have any time for themself due to those activities, they are.

We're talking about learning a new language, learning to play some musical instrument, or even playing some sport in a local team. Both now and when I was a child I know and knew a lot of parent who stressed a lot, and stressed a lot their child, because they didn't sense them to advance in those activities and pressured them a lot to improve in that.


> I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid.

I think that's only true if that "proportionally" means "proportional to the number of people who have to work with computers on a daily basis".

I don't know exactly when you were a kid, but from the ZX Spectrum comment, I'd guess it was in the early 1980s. I guarantee you more people, both in absolute numbers and in proportion to the total population, know concrete things about how computers work now than did then. Huge percentages of the population of the world had never even seen a computer when the ZX Spectrum was current. Computer Science curriculum (and related fields) was still, relatively speaking, in its infancy, and the number of institutions that even had anything that could be reasonably termed a functioning CS (or, again, related) department was fairly small.

Today, yes, there are a lot of people who know very little about how computers work, even though nearly everyone (especially in the Western world) uses computers on a daily basis. There are even a lot of people with tech-related degrees who don't know the full ins and outs of how computers work even at a medium level of abstraction.

But there are so many more people who have had the opportunity to learn about computers because of their ubiquity. There are so many more people who have, either through formal education or otherwise, learned how an operating system loads drivers, or how a file system manages space, or how a program allocates and frees memory, than in the early 1980s.

I believe what you are seeing is the difference between a world where, when there was someone you could talk to about computers at all, they had to know how they worked in order to use them effectively, and a world where computers are so widespread everyone uses them, and so user-friendly that the vast majority of people never need to know or care how to do anything remotely like changing the jumpers on a SCSI drive or the dip switches on a sound card.


I don't think this is true, at least in the UK. The BBC Computer Literacy Project meant that there were BBC Model Bs in every UK school, with tie-in educational materials. In the early 90s my entire class at a run-of-the-mill state primary school took turns pair-programming LOGO on the school's BBC Micro. I don't think you can get more ubiquitous than every single pupil having programmed a loop and a subroutine.


> For many today, computers are a complete mystery - indeed I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid. They have become magical devices that seem beyond comprehension for many, and I think that's to the detriment of everyone.

Yeah but did you understand how RAM chips worked? How processors worked? How registers and power supplies worked? Could you wire wrap a board? The previous generation of computer users knew how those things worked and often wired them up themselves.

Most of these arguments are emotional. We, as computer practitioners, are dismayed that the general public doesn't value our knowledge. But should they? Are the kids who actually want to learn about computers not able to learn about them?

I grew up fairly poor as a kid in the '80s-90s and most of my friends didn't have access to a computer at home. They learned the bare minimum they needed in the school library to finish homework assignments but otherwise didn't care. I was interested in computers and would go dumpster diving to find parts, but my friends didn't care. I'd say the cohort of kids who had the money, time, and inclination to have computers in the 1980s was much smaller than the cohort who want to get into them nowadays.


With respect to cars though, they are more reliable now, and problems are almost certainly harder for owners to fix on the road or in the home garage.

It's probably useful to have some notion of how cars operate in general. But I suspect fewer and fewer have deep knowledge and certainly the ability/interest to do their own auto repairs of any consequence.


> my ZX Spectrum

I'd hazard a guess OP meant early personal computers with something like Win 95, not underpowered microcontrollers.


By the time Win 95 rolled around, personal computers were well past the early stage, and the Z80 was neither underpowered nor a microcontroller. So I’m not sure what you’re saying, exactly.


I'm saying that even an Arduino is faster than a ZX Spectrum, so yes I would place it firmly in that category as it wouldn't exactly be able to run anything advanced enough to warrant maintenance.

> By the time Win 95 rolled around, personal computers were well past the early stage

That's a matter of perspective really, in 100 years even what's modern today will be considered early.


Things are mostly fine as they are (aside from the rise of locked-down, mandatorily encrypted, cloud-dependent stuff), but people definitely have lost some computer literacy.

Not that they've lost much that they couldn't google; even programming is easier than ever, and the hard stuff is specialist work like OS and hardware design. It's not like we need average people to know ASM and C.

But people do seem to have lost some of the interest. As I've said before, programming is easier than most other human activities; things like playing guitar or being a cashier are not only hard, they require skills that can't even be described fully.

But programming/IT/etc still takes time to learn, and I do think people probably are somewhat fried from all the short form content and less interested in anything that has a slow and careful process.


Programming might be easy but software development is incredibly difficult. The average person will be hundreds of times more successful picking up a guitar or being a cashier than releasing a useful piece of software.

Software development is incredibly difficult and a lot of people who are employed programmers aren't very good at it.


Picking up a guitar might be easy; playing an instrument well is hard. They say it takes ten years; I'd say it takes about that long to be a good software developer. And like being a musician, most people bail out before they've learned. A few people stick with it and still don't get good (the management promotion route is this way!).

Let's pass over the cashier thing; I was a cashier exactly once, on an exchange trip, so in a foreign language. I got the hang of it in about 20 minutes.


Writing stable, production-ready software is a grind. The personalities in software teams are often grumpy. I know many junior engineers who loved writing code and were dismayed at what went into writing production-grade software. Some of them moved to startups where the stakes were lower or they could work on MVPs, many transitioned into roles like PM or sales engineer, but a decent chunk just left software altogether. I love software so I won't leave but I know it's not for everyone.


When I upgraded from an Amstrad CPC to an Amiga in 1989, I gained access to a higher level of abstraction, like you said. Unlike the Amstrad, the Amiga had a GUI and a proper command line.

But unlike the information kiosk-esque devices of today, my Amiga let me navigate its layers of abstraction as I saw fit. These layers represented another dimension of usability. They were meant to be used and navigated.

Today we commonly think of a new layer of abstraction as if it "saves us" from the scary stuff underneath. We pave over what's underneath and pretend we are still standing on the ground. (Web interfaces are a good example. If I want to make a custom web widget I have to make it using form elements. If I am making a new widget in a desktop GUI, I make it using the same drawing and event primitives that the built-in widgets use. My new widget then sits at the same level of abstraction as the built-in widgets, and the overall system is preserved.)

Even modern desktops suck in this regard. Navigating the layers of abstraction is like being an archaeologist uncovering scenes of historical catastrophe in the folded and shattered strata of our systems.


I would disagree: operating a computer or computerized system is not necessarily the same as being computer savvy. That is what the author is trying to say.

I have long felt that when you have experienced the lower or lowest levels of abstraction, even in archaic times, you develop an intuition about how a thing works or how a problem might be fixed, or how to properly scrutinize or be suspicious of a particular new form of tech (itself often implemented at these higher levels).

One example would be diagnosing "bad internet" by evaluating the wifi signal strength and interference from similar channels, the cable/fiber modem's basic functionality and firmware version, the cable quality and function, the networking card in the computer, the function of the software behind it, the browser behavior, caching, or getting "wedged" somehow, the DNS resolver or its cache, an OS bug... it goes on and on.

The flip side of this is: everything is fricking multilayered and complicated! And it sucks! But that's unfortunately what we've got when you have so many layers involved.

Disclaimer: I'm firmly in the demographic described in the article.


I mostly agree with you¹; however, it is a problem that current computing environments do not actually allow people to make anything of their own. Just like '80s computers had BASIC, there should be Flash-like authoring programs that people could write shareable programs in. But there aren't! Specifically the sharing part, I mean. You can't write a simple Poke the Penguin app and send it to your friend for fun anymore. You're at the whims of enormous gatekeepers and censors.

1. https://news.ycombinator.com/item?id=19106922#19113359


There's environments like Scratch, Jupyter notebooks, Minecraft, and even Roblox where kids definitely build experiences for each other. The problem is, standards have gotten a lot higher these days. When I was a kid, I fooled around and made RPGs that looked similar to the ones that my friends played on home consoles. Now a AAA game will be much more engaging than anything a kid can put together.

These days it's become much simpler to put together attractive videos, which is why so many kids want to become streamers. Buy a nice camera and a few lights and you can produce media that looks like the movies I watched as a kid.


> Now a AAA game will be much more engaging than anything a kid can put together.

Two thoughts:

1) I kinda agree, and it's interesting how many of the founders of game studios who are now retiring got their start on the 8/16-bit home computers as teenagers, making games that were the AAA titles of their day, because the upper bounds on game size/complexity were so low.

2) There has never been a better time for free tooling and instruction for kids who are interested in building games. Unreal Engine 5, plus the thousands of hours of YouTube tutorials for it, is quite a starting point to have. It puts the Shoot 'Em Up Construction Kit that I spent hours in back in the 90s to complete shame!


1. Yup, they also had the advantage of being able to learn as the industry grew. They were at the forefront of the industry, developing the first 3D rendering pipelines, creating the first textures, scenes, etc. Nowadays it's a lot more knowledge to absorb. I feel the same way about networking. I think any young field is like this. There must have been a time when making bicycles or cars was like this too, when real progress was made through incremental experimentation.

2. 100%. I mean, I know many in my cohort who got interested in graphics programming by playing with Garry's Mod. There are still plenty of ways to make games or programs for friends. I'm still convinced that the kids (with stable/solvent home lives) who are interested in programming and making a computer do their bidding are supremely capable. I mean, today learning a bit of JavaScript is all you need to get started on the web.


You can send Swift Playgrounds apps to other people over messages. To pick just one example, from what's often the main target of these "LOL these aren't real computers" sorts of complaints. If you're on a desktop, there are tons of ways, most more accessible than Back In My Day.

I do think we lost something when Flash in particular died, but less in apps/games (there are... a lot of very, very indie games made these days by solo hobbyists or enthusiasts; check places like itch.io, it's an overwhelming number) and more in the independent animation scene, which was huge during the heyday of Flash but seems mostly dead now.


> You can’t write a simple Poke the Penguin app

You just have to write it in HTML and JavaScript.


Oh, cool. So I have to learn two languages. And it's going to look like shit if I don't also learn CSS, so three languages. And then there are meta-languages, like frameworks and CSS pre-processors. Writing a web page that lets you log in and looks halfway cool isn't an inviting project for a ten-year-old.



