What a whinge.

> Being a digital native doesn't automatically mean you're computer savvy.

What it actually means is that these generations operate at higher levels of abstraction. Even the "computer builders" of the past couple of decades merely snapped together lego bricks designed by others.

And anyway, in the "old days" you spent at least as much time looking after your system as actually using it (I was shocked how my PC friends had disk optimizers and anti-virus and whatnot). But back then it was part of the fun, as it is for hams.

But kids these days have too many important things to do to become computer hams.



> What a whinge.

Strong disagree here.

> What it actually means is that these generations operate at higher levels of abstraction.

That may be true, but I don't think it's necessarily a good thing. Take the analogy of a car driver: if you don't know how the car works aside from driving it, you are worse off in many respects. If it behaves oddly, you have no idea what is wrong, no idea how to fix it, and (in my experience) you are a lot more likely to pay significantly more to have it fixed.

For many today, computers are a complete mystery - indeed I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid. They have become magical devices that seem beyond comprehension for many, and I think that's to the detriment of everyone.

>And anyway, in the "old days" you spent at least as much time looking after your system as actually using it

Also not true. Didn't spend -any- time looking after my ZX Spectrum. Just turned it on and either started writing software straight away, or loaded a game up. Maintenance was not a thing.

>But kids these days have too many important things to do

That has certainly not been my experience as a parent. They have lots of things to do, but I don't think much of it is important. It's just attention-grabbing.


By the same token we could consider watching TV just a higher level of abstraction than being an amateur radio enthusiast. It would be technically correct while completely absurd in the real world.


>> But kids these days have too many important things to do

> That has certainly not been my experience as a parent. They have lots of things to do, but I don't think much of it is important. It's just attention-grabbing.

I think those "important things to do" means "extracurricular activities". They probably aren't so important to you or to me, but to the parents we all know who don't let their children have any time for themselves because of those activities, they are.

We're talking about learning a new language, learning to play a musical instrument, or even playing a sport on a local team. Both now and when I was a child, I've known a lot of parents who stressed out, and stressed out their children, because they didn't see them advancing in those activities and pressured them hard to improve.


> I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid.

I think that's only true if that "proportionally" means "proportional to the number of people who have to work with computers on a daily basis".

I don't know exactly when you were a kid, but from the ZX Spectrum comment, I'd guess it was in the early 1980s. I guarantee you more people, both in absolute numbers and in proportion to the total population, know concrete things about how computers work now than did then. Huge percentages of the world's population had never even seen a computer when the ZX Spectrum was current. Computer Science curricula (and those of related fields) were still, relatively speaking, in their infancy, and the number of institutions that had anything that could reasonably be termed a functioning CS (or, again, related) department was fairly small.

Today, yes, there are a lot of people who know very little about how computers work, even though nearly everyone (especially in the Western world) uses computers on a daily basis. There are even a lot of people with tech-related degrees who don't know the full ins and outs of how computers work even at a medium level of abstraction.

But there are so many more people who have had the opportunity to learn about computers because of their ubiquity. There are so many more people who have, either through formal education or otherwise, learned how an operating system loads drivers, or how a file system manages space, or how a program allocates and frees memory, than in the early 1980s.

I believe what you are seeing is the difference between a world where, if there was someone you could talk to about computers at all, they had to know how computers worked in order to use them effectively, and a world where computers are so widespread that everyone uses them, and so user-friendly that the vast majority of people never need to know or care how to do anything remotely like changing the jumpers on a SCSI drive or the DIP switches on a sound card.


I don't think this is true, at least in the UK. The BBC Computer Literacy Project meant that there were BBC Model Bs in every UK school, with tie-in educational materials. In the early 90s my entire class at a run-of-the-mill state primary school took turns pair-programming LOGO on the school's BBC Micro. I don't think you can get more ubiquitous than every single pupil having programmed a loop and a subroutine.


> For many today, computers are a complete mystery - indeed I'd hazard that proportionally far more people know nothing concrete about how computers work than did when I was a kid. They have become magical devices that seem beyond comprehension for many, and I think that's to the detriment of everyone.

Yeah, but did you understand how RAM chips worked? How processors worked? How registers and power supplies worked? Could you wire-wrap a board? The previous generation of computer users knew how those things worked and often wired them up themselves.

Most of these arguments are emotional. We, as computer practitioners, are dismayed that the general public doesn't value our knowledge. But should they? Are the kids who actually want to learn about computers not able to learn about them?

I grew up fairly poor as a kid in the '80s-90s and most of my friends didn't have access to a computer at home. They learned the bare minimum they needed in the school library to finish homework assignments but otherwise didn't care. I was interested in computers and would go dumpster diving to find parts, but my friends just weren't interested. I'd say the cohort that had the money, time, and inclination to have computers as kids in the 1980s was much smaller than the cohort that does nowadays.


With respect to cars, though: they are more reliable now, and problems are almost certainly harder for owners to fix on the road or in the home garage.

It's probably useful to have some notion of how cars operate in general. But I suspect fewer and fewer have deep knowledge and certainly the ability/interest to do their own auto repairs of any consequence.


> my ZX Spectrum

I'd hazard a guess OP meant early personal computers with something like Win 95, not underpowered microcontrollers.


By the time Win 95 rolled around, personal computers were well past the early stage, and the Z80 was neither underpowered nor a microcontroller. So I’m not sure what you’re saying, exactly.


I'm saying that even an Arduino is faster than a ZX Spectrum, so yes, I would place it firmly in that category, as it wouldn't exactly be able to run anything advanced enough to warrant maintenance.

> By the time Win 95 rolled around, personal computers were well past the early stage

That's a matter of perspective, really; in 100 years even what's modern today will be considered early.


Things are mostly fine as they are (aside from the rise of locked-down, mandatorily encrypted, cloud-dependent stuff), but people definitely have lost some computer literacy.

Not that they've lost much they couldn't google; even programming is easier than ever, and the hard stuff is specialist work like OS and hardware design. It's not like we need average people to know ASM and C.

But people do seem to have lost some of the interest. As I've said before, programming is easier than most other human activities; things like playing guitar or being a cashier are not only hard, they require skills that can't even be fully described.

But programming/IT/etc. still takes time to learn, and I do think people are probably somewhat fried from all the short-form content and less interested in anything that has a slow and careful process.


Programming might be easy, but software development is incredibly difficult. The average person will be hundreds of times more successful picking up a guitar or being a cashier than releasing a useful piece of software.

Software development is incredibly difficult, and a lot of people who are employed as programmers aren't very good at it.


Picking up a guitar might be easy; playing an instrument well is hard. They say it takes ten years; I'd say it takes about that long to be a good software developer. And like being a musician, most people bail out before they've learned. A few people stick with it and still don't get good (the management promotion route is this way!).

Let's pass over the cashier thing; I was a cashier exactly once, on an exchange trip, so in a foreign language. I got the hang of it in about 20 minutes.


Writing stable, production-ready software is a grind. The personalities in software teams are often grumpy. I know many junior engineers who loved writing code and were dismayed at what went into writing production-grade software. Some of them moved to startups where the stakes were lower or they could work on MVPs, many transitioned into roles like PM or sales engineer, but a decent chunk just left software altogether. I love software so I won't leave but I know it's not for everyone.


When I upgraded from an Amstrad CPC to an Amiga in 1989, I gained access to a higher level of abstraction, like you said. Unlike the Amstrad, the Amiga had a GUI and a proper command line.

But unlike the information kiosk-esque devices of today, my Amiga let me navigate its layers of abstraction as I saw fit. These layers represented another dimension of usability. They were meant to be used and navigated.

Today we commonly think of a new layer of abstraction as if it "saves us" from the scary stuff underneath. We pave over what's underneath and pretend we are still standing on the ground. (Web interfaces are a good example. If I want to make a custom web widget I have to make it using form elements. If I am making a new widget in a desktop GUI, I make it using the same drawing and event primitives that the built-in widgets use. My new widget then sits at the same level of abstraction as the built-in widgets, and the overall system is preserved.)
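(A contrived but concrete sketch of what I mean - untested, just for flavour: a "star rating" widget on the web gets glued together out of form elements, rather than out of the drawing and event primitives the built-in widgets themselves are made of.)

  <!-- hypothetical star-rating widget assembled from radio inputs -->
  <label><input type="radio" name="stars" value="1"> 1</label>
  <label><input type="radio" name="stars" value="2"> 2</label>
  <label><input type="radio" name="stars" value="3"> 3</label>
  <script>
    // the "widget" is really just existing form elements plus glue code
    document.querySelectorAll('input[name=stars]').forEach(el =>
      el.addEventListener('change', e => console.log('rated:', e.target.value)));
  </script>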

Even modern desktops suck in this regard. Navigating the layers of abstraction is like being an archaeologist uncovering scenes of historical catastrophe in the folded and shattered strata of our systems.


I would disagree: operating a computer or computerized system is not necessarily being computer savvy. That is what the author is trying to say.

I have long felt that when you have experienced the lower or lowest levels of abstraction, even in archaic times, you develop an intuition about how a thing works or how a problem might be fixed, or how to properly scrutinize or be suspicious of a particular new form of tech (itself often implemented at these higher levels).

One example would be diagnosing "bad internet" by evaluating the wifi signal strength and interference from overlapping channels, the cable/fiber modem's basic functionality and firmware version, the cable quality and function, the networking card in the computer, the function of the software behind it, the browser's behavior, caching, or getting "wedged" somehow, the DNS resolver or its cache, an OS bug... it goes on and on.
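(With that intuition you can test one layer at a time. A rough, untested sketch in Node.js, just to illustrate the idea: isolating the DNS layer by itself.)

  // rough sketch: test only the DNS layer, using Node's built-in dns module
  const dns = require('node:dns').promises;

  dns.lookup('example.com')
    .then(({ address }) => console.log('DNS resolves fine:', address))
    .catch(err => console.error('suspect the DNS layer:', err.code));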

The flip side of this is: everything is fricking multilayered and complicated! And it sucks! But that's unfortunately what we've got when you have so many layers involved.

Disclaimer: I'm firmly in the demographic described in the article.


I mostly agree with you¹; however, it is a problem that current computing environments don't actually let people make anything of their own. Just like '80s computers had BASIC, there should be Flash-like authoring programs with which people could write shareable programs. But there aren't! Specifically the sharing part, I mean. You can't write a simple Poke the Penguin app and send it to your friend for fun anymore. You're at the whims of enormous gatekeepers and censors.

1. https://news.ycombinator.com/item?id=19106922#19113359


There are environments like Scratch, Jupyter notebooks, Minecraft, and even Roblox where kids definitely build experiences for each other. The problem is that standards have gotten a lot higher these days. When I was a kid, I fooled around and made RPGs that looked similar to the ones my friends played on home consoles. Now a AAA game will be much more engaging than anything a kid can put together.

These days it's become much simpler to put together attractive videos, which is why so many kids want to become streamers. Buy a nice camera and a few lights and you can produce media that looks like the movies I watched as a kid.


> Now a AAA game will be much more engaging than anything a kid can put together.

Two thoughts:

1) I kinda agree, and it's interesting how many of the founders of game studios who are now retiring got their start on the 8/16-bit home computers as teenagers, making games that were the AAA titles of their day, because the upper bounds on game size/complexity were so low.

2) There has never been a better time for free tooling and instruction for kids who are interested in building games. Unreal Engine 5, plus the thousands of hours of YouTube tutorials for it, is quite a starting point to have. It puts the Shoot 'Em Up Construction Kit that I spent hours in back in the 90s to complete shame!


1. Yup, and they also had the advantage of being able to learn as the industry grew. They were at its forefront, developing the first 3D rendering pipelines, creating the first textures, scenes, etc. Nowadays there's a lot more knowledge to absorb. I feel the same way about networking. I think any young field is like this. There must have been a time when making bicycles or cars was like this too, when real progress was made through incremental experimentation.

2. 100%. I mean, I know many in my cohort who got interested in graphics programming by playing with Garry's Mod. There are still plenty of ways to make games or programs for friends. I'm still convinced that the kids (with stable/solvent home lives) who are interested in programming and making a computer do their bidding are supremely capable. Today, learning a bit of JavaScript is all you need to get started on the web.


You can send Swift Playgrounds apps to other people over Messages, to pick just one example from what's often the main target of these "LOL these aren't real computers" sorts of complaints. If you're on a desktop, there are tons of ways, most more accessible than Back In My Day.

I do think we lost something when Flash in particular died, but less the apps/games (there are... a lot of very, very indie games made these days by solo hobbyists and enthusiasts; check places like itch.io, it's an overwhelming number) and more the independent animation scene, which was huge during the heyday of Flash but seems mostly dead now.


> You can’t write a simple Poke the Penguin app

You just have to write it in HTML and JavaScript.
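Something like this, as a rough untested sketch, would already be a shareable Poke the Penguin - save it as poke.html and send the file to a friend:

  <!-- poke.html: a minimal, hypothetical Poke the Penguin -->
  <p id="penguin">🐧</p>
  <button onclick="poke()">Poke</button>
  <script>
    let pokes = 0;
    function poke() {
      pokes += 1;
      // swap the penguin's reaction after enough pokes
      document.getElementById('penguin').textContent =
        pokes < 5 ? '🐧 ow! (poked ' + pokes + ' times)' : '💢🐧 stop poking me!';
    }
  </script>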


Oh, cool. So I have to learn two languages. And it's going to look like shit if I don't also learn CSS, so three languages. And then there are meta-languages, like frameworks and CSS pre-processors. Writing a web page that lets you log in and looks halfway cool isn't an inviting project for a ten-year-old.



