
From the article: "What worries me as much as the end of general-purpose computing for the masses is that so few seem to understand that it is ending."

Correction: Has Ended. The vast majority of people using computers have shifted to phones and tablets. I have a friend who's a millennial who barely knows how to use his desktop PC (It's a Mac) but is fluent on his iPad/iPhone.

Most people are not content creators, they are consumers. The people who are still building their own PCs are mostly enthusiasts. There's no profit in building custom PCs anymore, so local computer shops aren't selling them (and if they are, it's at a hefty premium).

The author of the article calls the 1980s the "Golden era" of personal computing. I disagree. Networked computing is an amazing thing, but it's also been monetized, and now the consumers are the product being sold. And happily so. The outliers are those who buy pre-2012 computers so they can skip UEFI and are happy to live under their rocks to do it. That's okay with me.

Personally I need a faster computer than that, so I'm stuck with UEFI, and because W10 is the right OS for my needs, I'm stuck with it phoning home. This doesn't mean the golden age was in the 80s or 90s. It means that by the time we figured out how to make computing awesome, we also figured out how to make computing itself profitable, not just the computers. The golden age that could have been never was.




I don't think "the masses" were ever interested in general-purpose computing. For them it never started.

For people who actually care, I think the concerns are overblown and we'll always have access to a machine that can run open software.

And yes sure, we might not be able to pay consumer prices for them, but they are not going anywhere.

And sure, we might elect a fascist government that will simply ban the technology, or the West will go to war with China, making it very difficult to manufacture anything. Unlikely.


This is a major source of cognitive dissonance.

Before social media, the average person had little to no interest in the internet - or PCs in general - aside from email and instant messaging apps. The internet has changed, but it's 99.9% because more people are using it, and their interests and needs are best served by 'it just works'.

This change, along with progress in general, has also made life better for technical folks. General purpose computing is dead? Come on. Desktop Linux actually works. Municipal fiber is a thing. Capable enterprise hardware is available online for less than a second-hand iPhone, and if you don't have a couple of hundred to spend, a VPS costs less than a hamburger.


There's a large chunk of enthusiasts between "the masses" and the hard core of people who "actually care" enough (and are rich enough) to go out of their way to have their own machine at home no matter what, short of it being illegal.


No self-respecting author would write "We'll always have access to typewriters without built-in censorship. Sure, we might not be able to pay consumer prices for them, but they are not going anywhere."

What makes you so sure the restrictions will stop somewhere you're comfortable with? What happens when unlocked machines are only available to "enterprise" (not unlike a certain Windows version), and you have to agree to audits when buying them? What chance do free operating systems stand, when 99% of users don't even have access to computers that could run them?

We need not worry as long as somewhere, in some dark basement, a sysadmin is able to assemble an unlocked personal computer from his company's old hardware?


If unrestricted machines are available only to people with money, it perpetuates our unjust society: a prison-like, nudge-policed, restricted experience for the poor and a free democratic state for the rich. This is just another case of the economy producing that result through individual consumer choices while simultaneously enriching everyone at the top.

The way to stop this is through legislation and protest, but it's a battle because the state likes locked down computers too.


I guess I'm confident that an open market will always provide for us. I don't know why you think it would not.

Right now, today, it has never been easier to just go down to the local mall and buy a general purpose computer. Sure, there are lots of phones, tablets, and consoles on display as well, but there are no fewer computers as a result.


Good luck getting LSD or uranium-235 on the open market. You can still buy these, but only illegally. The market is slightly less open due to laws and regulations.

These regulations could just as easily outlaw general-purpose computing on the same grounds of being too dangerous for society. I can remind you how encryption algorithms were considered "munitions" back in the 1990s, less than 30 years ago. At that time, one might have said that it would always be possible to bring a small bottle of water on board an airplane. Betting on things always being "naturally" available, without a conscious effort to sustain them, is a fraught enterprise.


They are still considered munitions. That law passed just fine. So now there's someone re-implementing the algorithms in Europe because they can't be exported.


I did concede in my original comment that a fascist government could make them illegal. I think that's far more likely than their simply becoming uneconomical to make and disappearing from the market.

If we continue to give power to right wing extremists, all bets are off.


In this regard, left-wing extremists dislike freedom as much.


> I don't know why you think it would not.

The percentage of unlocked computers available to consumers has been falling. Consoles, tablets, smartphones (all iPhones and many Androids don't allow rooting), smart TVs... even some laptops. Surface RT devices had locked Secure Boot, and Chromebooks made it very difficult to install alternative OSes, as far as I know.

As for the incentives of the "open market", there's profit in locking down devices and selling you back fragments of that locked functionality. For example, Nvidia pushed an EULA that forbade using their consumer GPUs in datacenters, forcing datacenter operators to buy the far more expensive enterprise versions.

With how markets tend towards oligopolies and monopolies, there's no guarantee the handful of manufacturers left won't conspire to put restrictions on all computers. Especially if 95% of consumers don't care about being able to run gcc.


> I don't know why you think it would not.

I don't know why you think it would. On both the software and hardware side, there's no end of complaints that power users are being left behind. All the money is in making Fisher-Price software for the masses. Ain't nobody got time to cater to ergonomics or efficiency in computing; it's only a distraction from milking the general population.

This, to me, is a clear example of the market underserving a customer segment.


Fortunately, businesses still have both plenty of use for general purpose computers and lots of money. PC hardware isn't going away any time soon.

The more worrying thing IMHO is that the software to make use of that hardware is becoming increasingly polarised between what's aiming for big businesses and what's aiming for consumer drones.

The thing that gives me some hope is that the geeks all those businesses rely on to build the software that makes them billions are almost all in the "lost middle" and, given enough time and inclination, are quite capable of creating open software to destroy those businesses.


I run a Linux distro on a Macbook Pro and I disagree with you.

Sure, I'm as annoyed as you and the author about UEFI, but as long as I can install an OSS bootloader and launch whatever I want, I'm good. It may be a pain, but the tinkering capability is still there. If it weren't possible to tinker with it, fewer people would buy that laptop.

Having closed hardware is a problem and it definitely doesn't make me feel safe. We may all be full of NSA backdoors, but to be honest I don't care much about that, and I assume a large part of the tech-knowledgeable crowd feels the same way. I suspect this is why Open Hardware is not more prevalent in the market: not enough interest.

I don't care about computer companies making more user-friendly devices; we can find a way to hack whatever, or just start buying laptop parts from China and make our own laptops. What worries me is the government deciding to ban cryptography or introducing regulations that force me to change how I tinker with my computer.


>This, to me, is a clear example of the market underserving a customer segment.

Sure, there are some specific form factors that I would love to have that are not available or very rare. I really want to replace my iPad with something that runs Linux, for example.

All I'm arguing is that I don't believe there will ever be a time, in my lifetime or the foreseeable future, when I can't reasonably easily get some kind of desktop computer and run whatever code I want on it. (Unless those computers literally become illegal, like some code already is.)

I might not be able to carry it in my pocket, or wear it on my face, and it might not have some holographic interface that hasn't been invented yet.

I'm also not saying we should ignore the fight for general purpose machines. I will continue to vote with my wallet, as well as lobby for laws that open up our machines.

I think it's a disgrace that Apple no longer supports my iPad, which functions perfectly well, but on which I can no longer install anything.

Society would be much better if all our phones were open, I just don't think we need to freak out about desktop computers disappearing.


> I have a friend who's a millennial who barely knows how to use his desktop PC (It's a Mac) but is fluent on his iPad/iPhone.

This worries me not because of any open computing concerns, but because of how FUNDAMENTAL general purpose computers are to nearly every decent-paying professional job out there. All of them. No one daily-drives Excel or Salesforce One or edits video or codes websites or balances accounts in QuickBooks on an iPad. Literally no one; we're decades away from that being mainstream, if it ever happens.

Everyone needs to be computer literate. Being a whiz on your iPhone or iPad does not, in any sense of the word, make you computer literate. Kids growing up with these closed-off devices quite literally snuffs out the desire to learn how computers work, or even the chance of accidentally picking up enough glancing knowledge to hold a good office job.

Companies like Apple endlessly mouth their desire to get more kids to code, then turn around and release devices so closed off that even trying to take a quick peek at how they work triggers eighteen system integrity protection alarms. You can't have it both ways, Apple.

Well, whatever. I don't really care. It just means more ultra high paying jobs for me. Apple (and Google, and to a lesser degree Microsoft) are literally actively sabotaging their own talent pool, guaranteeing that they'll never be able to hire enough people while paying the ones they do hire a ton of money.


> Companies like Apple endlessly mouth their desire to get more kids to code, then turn around and release devices so closed off that even trying to take a quick peek at how they work triggers eighteen system integrity protection alarms. You can't have it both ways, Apple.

Apple C-Suites: "We want kids to code."

Apple Ads: "What's a computer?"

Whilst it might be me, somebody somewhere doesn't understand something.


It's not all-or-nothing. One side perpetuates ignorance to keep consumer prices high; the other balances out to produce just enough technicians to keep wages low. They don't need everyone to be a technician, but they do need everyone to be a consumer.

If you think of it as a firewall around knowledge, it's basically like this set of rules:

  Deny: all

  Allow: <specific group>


Now I know it's me missing something:

How does limiting the supply of technicians lower their wages? Is it because Apple is limiting its own supply of available technicians? Why can't a given technician go elsewhere -- is that because the "don't poach" treaty never stopped, or something else?


I think you're underestimating the convergence of tablets and PCs here.

I edit 4k video on my iPad, edit pictures, memes, do a bit of drawing with Procreate. I'm by no means professional at any of these things, but it isn't the tools holding me back.

I don't see any reason not to use Excel or Numbers or what have you on an iPad with the keyboard. I'm sure it would work fine.

I know a couple professional music producers who do the bulk of their work on a tablet now.

What hasn't made it to tablets is coding environments. Codea and Swift Playgrounds feel like toys to me.

But it's only a matter of time. I think "decades, if it ever happens" is entirely too pessimistic; it's more "now, if you want it" for most of those things.

And it's definitely not "literally no one". It's more people than you think.


Samsung DeX was a good attempt.

If it were possible to multitask Excel or Word in Android, that's probably what I would be using right now for a lot of my basic workflow.


I agree with your point, but remember that the average person ten years ago was just as technically illiterate.

We look through rose-colored glasses because, back then, we were able to use Excel, Word, (insert your program of choice). The people in our social circle were also able to use them. However, this was almost entirely because of who we hung out with.

The shift to closed devices (may) eventually have an effect on hobbyists, in the same way that the average person no longer knows how to change the oil in their car, but for now there's little evidence that iPhones and tablets are limiting the number of young, competent programmers (or even the number of IT hobbyists in general).


> Most people are not content creators, they are consumers.

And using locked-down devices will condemn them to that for the term of their natural lives.

But actually I disagree. At different stages in their lives, people do different things. Make music, take photos, and wonder why they suck. Make family trees. Pursue various hobbies that require working with multiple bits of software across multiple projects.

Take electronics. It's a hobby. It's nuts to use apps that each store their data in their own sandbox. One app to draw a schematic diagram, another app to plan the hardware layout, another app to record calibrations and test results, another for photos--and other apps to record the project goals and plans, and the lessons learned.

If I couldn't use a general purpose computer and store all of these files (and others, such as manufacturers' datasheets for critical components) in a single folder for each project (not part in one app, another bit in another app, and so on), I'd be using paper, a drawing board and a filing cabinet.


I think what happened is that smartphones took over because people who were not using PCs started using smartphones. It looks like phones took over, but really they tapped into a gigantic market that was never touched by desktop PCs, whose users are mostly professionals, power users, gaming nerds, etc. These groups like the most powerful computing options, and still do, but there are way more non-technical people using a computing device now than before.


> What worries me as much as the end of general-purpose computing for the masses is that so few seem to understand that it is ending.

I think the author would have better phrased his comment as: "What worries me as much as the end of general-purpose computing for hobbyists and hackers is that so few seem to understand that it is ending."

> Correction: Has Ended.

I guess I have a different definition of "General Purpose Computing" than you or the author, because personally I see the iPhone and iPad as the beginning of general purpose computing for the masses. The traditional PC was computing for the elite/computer experts.

This romanticism of 80s computing as the golden age completely mystifies me. The 80s/90s were a phenomenally fun time for geeks like me/us. For everyone else, a PC was a thing you had to learn a bunch of esoteric bullshit to use.

"For the masses", the iPhone, Android, and the iPad are far more interesting and useful.

Even if you look at more specific PC-style computing, most people don't care about whether their BIOS is locked or not.


> Most people are not content creators, they are consumers.

I don’t think this is true at all; everybody who has a Twitter / Facebook / Instagram / YouTube / TikTok account is a content creator. There’s a reason the term “browsing the web” has died out: it’s not about passive consumption anymore.

Writing code is just about the only thing you can’t do on an iPad. And while I think we can all agree on HN that having a gateway to programming is super important, you can’t expect most people to care.

Typed on my iPhone. In bed.


> everybody that has a Twitter / Facebook / Instagram / YouTube / TikTok account is a content creator.

All the fora you mention are characterized by a lack of long-form text. What people create today tends to be image-heavy and to eschew long argumentation or prose storytelling.

Having just a smartphone obviously represents a limitation. For example, I am a member of some niche travel communities. A decade ago, when it was common for people to lug a laptop, there was a healthier community of people writing detailed blog posts on certain destinations, editing Wikivoyage or Hitchwiki, etc. Now that those same travelers are leaving the laptop at home and traveling only with a phone where it is inconvenient to enter much text, the ecosystem of travel resources is actually poorer than it was in the past.


I see a lot of long-form Medium articles these days. It’s not that they don’t exist. They are just very bad.


Definitely agree with this. Twitter and other social media have really forced the short 140-character snippet or image to be favored over thoughtful posts and blog articles.


Yes, long-form text is down and photo/video-based content has exploded; things are different, but I’d hesitate to say they’re worse overall. I used to share a computer with 5 other family members growing up in the 90s; nowadays every kid has a content creation device on them 24/7. That seems like a good thing to me.


You can totally write code on an iPad. There are plenty of JavaScript environments, not to mention Pythonista.

Swift Playgrounds will actually run quite large codebases with a little fiddling around.

You can trivially get a robust Linux command line with Blink and a free tier Google compute instance.
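For the curious, a minimal sketch of that setup, assuming you already have the gcloud CLI configured somewhere and Blink installed on the iPad (the instance name and zone here are just examples):

  # one-time: create a small free-tier instance
  gcloud compute instances create ipad-dev \
      --machine-type=f1-micro --zone=us-central1-a

  # then, from Blink on the iPad:
  ssh <your-user>@<instance-external-ip>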

I agree there is no Xcode.


The iPad coding experience is decidedly sub-par and has a long way to go.

This is in contrast to almost anything else you might care to do on a computer, where the iPad equivalent ranges from "basically fine" to "actually better".

But people's perceptions are lagging a few years behind where the platform is at. That's normal.


I mean, sure, I can technically code on my PS4 too using the web-based Digital Ocean console.

I didn’t make the distinction between writing code and writing software but I guess it’s an important one.


It is an important distinction.

I think you’d be surprised how close Swift Playgrounds has become.

All it really needs is a project navigation UI, and it will be good enough for building pure Swift/SwiftUI apps.


> There's no profit in building custom PCs anymore, so local computer shops aren't selling them (and if they are, it's at a hefty premium).

Most (if not all) shops I have seen will sell you a custom PC for the price of the components. Unless you are talking about enterprise ones (not sure if those exist), most of these cater to gamers, who are price-sensitive. But the components are the same, and you can pick an aluminum case instead of the RGB ones.


On the other hand, 30 million Raspberry Pis have been sold, and an enormous number of people are learning to code as a result of free online resources. I'm not so sure general-purpose computing for the masses is all that over.

Even your example describes someone who, while he isn't fluent with it, actually owns a general purpose computer.


There are two factors: medium and platform.

PCs are better for long text. Phones are better for photos/media. Both have pros and cons.

There's also the platform. A lot of phone content ends up siloed into auto-suggesting apps, where click-bait (touch-bait?) tends to just overrun people.

People can now sit for hours just swiping on auto-suggested content. The same can be true for text (see clickbait) but you'd tend to access that through a web browser, where you're just a tab away from something actually useful.


The masses run any program or punch their credentials into any phishing site you send them. I'm fine with them being restricted to the iPhones and Androids of the world.

That's where it's going to go; the security situation is not getting better. You already see it in a lot of political organizations, where many of the workers get by with Chromebooks. And for Joe Schmo, that's FINE.


Android is still general purpose. I have Termux if I need to compile and run something. I can sideload APKs. I run Windows on desktop for newer games, but quite a few work on my Ubuntu laptop too.
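To illustrate how far that goes, here's a sketch of compiling and running C entirely on-device in Termux, no root required (package names as of the current Termux repos):

  # inside the Termux app
  pkg install clang
  printf '#include <stdio.h>\nint main(void){puts("hello from Android");}\n' > hello.c
  clang hello.c -o hello && ./hello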


Unless your overlords benign be: "Nein!"


Only if you root. General purpose computers can run alternative OSs.


To be fair, quite a few Android phones have unlockable bootloaders. You go to the manufacturer's website, download a utility or request a code, and you can then also install whichever OS you want.
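The rough flow, as a sketch (exact commands vary by vendor, and some require an unlock code obtained from the manufacturer first):

  adb reboot bootloader        # reboot into fastboot mode
  fastboot oem unlock          # older devices
  fastboot flashing unlock     # newer devices
  # then flash what you like, e.g. a custom recovery image:
  fastboot flash recovery twrp.img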


In my experience, realistically, even if you can unlock the bootloader, the hardware is still so proprietary that you have no hope of ever installing an alternative OS on it, because by the time you reverse engineer it all, the phone is already 10 years old.


In my experience it usually goes just fine. I’m actually not totally sure what you’re talking about: what does the hardware have to do with it if you’ve got the bootloader unlocked? Assuming you’ve got something common like a Qualcomm SoC, it’s pretty straightforward to take codeaurora or whatever source plus AOSP and, 90% of the time, end up with an almost fully functional custom image for your device.
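For reference, the happy path looks roughly like this (a sketch; the lunch target is device-specific, and you'd merge in the vendor/codeaurora kernel sources as appropriate):

  # fetch the AOSP tree (tens of GB)
  repo init -u https://android.googlesource.com/platform/manifest
  repo sync

  # configure and build
  source build/envsetup.sh
  lunch aosp_<device-codename>-userdebug
  m -j$(nproc)

  # flash to the bootloader-unlocked device
  fastboot flashall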

Now granted, I’ve rooted every phone I’ve had since the HTC Incredible, and I always check the XDA forums before buying a new one to see if there’s a decent development community. And in recent years there has been a SHARP drop in the percentage of manufacturers that even allow their bootloaders to be unlocked - I don’t think the last six or so Samsung Galaxys have been unlockable at all. But there are still plenty; I’ve managed to find a kickass flagship model every year or two and upgrade through at least 2 or 3 versions of Android (I always buy last- or last-last-gen models used for <$200, so they’re typically a bit out of date already).

I couldn’t imagine any other way, honestly. I absolutely love rooting & flashing custom ROMs, and it’s kept all my phones practically as young and snappy as the day they were originally released... or better.


There are different definitions of "alternative OS". I'm guessing yours is a different build of Android with the original kernel blobs transplanted from the OEM ROM. The other one is being able to run mainline Linux or a custom OS from scratch.

If Linux didn't already exist, it would never have been possible to create, since all computing platforms are getting more and more locked down.


Is there any phone with full open source support, so you can compile Android and run it without any proprietary driver blobs?


Yes. The newest device is the Samsung Galaxy S3 (2012): https://redmine.replicant.us/projects/replicant/wiki/GalaxyS...

You can also run mainline Linux on the 2012 Nexus 7, but you lose Wi-Fi support.

Most alternative OSs just use the patched kernel that shipped with the device.


That's a bit of a sad state of affairs, but better than nothing, thanks! The latest version was released almost 3 years ago. I wonder if security is being taken seriously? It seems security holes are found more often than that.


I use a modern desktop with Arch Linux and KDE. I stopped using Windows ~10 years ago and I'm not looking back.


You can easily disable W10 telemetry and all the other nonsense they put in it.
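For example, the documented group-policy registry value can be set from an elevated command prompt (a sketch; note that the value 0, "Security", is fully honored only on Enterprise/Education editions - Pro and Home fall back to a higher telemetry level):

  reg add HKLM\SOFTWARE\Policies\Microsoft\Windows\DataCollection ^
      /v AllowTelemetry /t REG_DWORD /d 0 /f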


And they turn it back on after updates, or ignore the setting, or nag you every day to re-enable it. Every time I boot back into Windows I get a full-screen message about tying my account to an online one.


Weird. I don't get that. I used https://www.w10privacy.de/english-home/



