Considering all the discussions about JavaScript and sites with primarily text requiring it, this looks like a case of "the more things change, the more they stay the same" :-P. Also see Wirth's "A Plea for Lean Software"[0] from 1995 for another "timeless" issue.
I did find this bit interesting too...
> Apart from this practical reason, there's a principal one: The first time I invoked Netscape, it said that it is obsolete and refuses to work. I don't use software that thinks it knows better than I when I should stop using it.
...considering the modern trend of auto-updating software. The author (after this paragraph) also considers availability, but another issue is whether the software is something one would even want to use if it were available - for instance, I personally never liked any version of Paint Shop Pro after version 7, since I found all of them a degradation. I can still use PSP7 just fine (even on Linux via Wine) - imagine if the software decided by itself that it is too old to run, or replaced itself with a new version against my wishes (something a lot of software does nowadays).
From a user's perspective this also has implications for preserving backwards compatibility in the foundational functionality programs rely on.
Firefox on Ubuntu (which I'm not using anymore, for these and other reasons) and presumably on other OSs does exactly that: it refuses to open new tabs/sites once every two weeks or so, telling me there's "one more thing we need to do", i.e. self-update. So you're losing all your browsing context - though FF does a better job restoring it after restart than it used to - but still, that's user-hostile and self-important as fuck. And there's also a "principal" reason why I can't stand this: the Web is now 30 years old and past its peak, so if browsers still need to update bi-weekly for new features and experiments, that, combined with the lack of browser diversity, proves beyond any shadow of a doubt that there's something very wrong with the incentives for browser development, the Google/Mozilla browser cartel, and the evolution of "web standards".
Nope, it only happens when something outside changed Firefox's files (in this case when Ubuntu swapped files because dpkg updated Firefox). This never happens* in Windows and macOS (it might nag, but you can definitely dismiss it). It seems that johnchristopher has a suggestion to disable auto-updates on a specific application in Debian and Ubuntu (haven't tested it though): https://news.ycombinator.com/item?id=33202052
* At least when using their official installers. I'm specifically excluding Chocolatey/brew/other loose-file update mechanisms, or cases where your bonkers enterprise solution insists on using loose files rather than relying on *.msi/*.app/*.pkg.
This is exactly right. And it happens because Firefox needs to load libraries or exec subprocesses that aren't compatible between versions. Since the matching version of the file has been replaced with a newer one, it doesn't really have a choice: it is unable to launch the new tab. The nice message is a better alternative than crashing.
This also doesn't occur on NixOS because the new version is in a different directory and the old version is kept until it is garbage collected.
> This also doesn't occur on NixOS because the new version is in a different directory and the old version is kept until it is garbage collected.
Also with Flatpak: you are using the old version until you close it. Then it will be garbage collected, and on the next launch you will be running the new version from a different root.
This is 100% something that could be fixed by the browser. Package updates don't change existing files; rather, they create new files and update the filesystem to point to them. All it takes to not need a restart is to keep file handles open for all needed resources and to fork() from such a process instead of exec()ing new ones. Not entirely trivial for applications as monstrously complex as browsers, but hardly comparable to the other challenges they need to solve. It's only because they control the update process on the OS they care about (Windows) that they don't bother.
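A minimal Node sketch of the POSIX behaviour this would rely on - an already-open file descriptor (which a forked child inherits) keeps pointing at the old file even after an update replaces the path with a new one; the file name here is made up:

    const fs = require('fs');

    fs.writeFileSync('example-lib.so', 'version 1');
    const fd = fs.openSync('example-lib.so', 'r');   // the running process holds the old file open

    // Simulate a package update: a new build atomically replaces the old file.
    fs.writeFileSync('example-lib.so.new', 'version 2');
    fs.renameSync('example-lib.so.new', 'example-lib.so');

    // The path now resolves to the new file...
    console.log(fs.readFileSync('example-lib.so', 'utf8'));  // "version 2"

    // ...but the descriptor still references the old inode, so the process
    // (and anything forked from it) keeps seeing the version it started with.
    const buf = Buffer.alloc(16);
    const n = fs.readSync(fd, buf, 0, buf.length, 0);
    console.log(buf.toString('utf8', 0, n));                 // "version 1"
    fs.closeSync(fd);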
If you download the linux version of Firefox from mozilla.org you get a tarball that you can extract and run Firefox from without needing to do anything to install it. When I've run it this way it self updates the same way as it does on macos or windows.
> Nope, it only happens when something outside changed Firefox's files … This never happens* in Windows and macOS (it might nag, but you can definitely dismiss it).
Happens with multiple Firefox instances too, `firefox -P -no-remote` (I think in recent versions the -no-remote is redundant), even on Windows with official installers. In that case you don't even get the error message; new tabs just remain blank. There might be a delay between process A's update and symptoms appearing in process B; not sure.
> the Web is now 30 years old, past its peak, so if browsers still need to update bi-weekly for new features and experiments, this in combination with lack of browser diversity proves without any shade of doubt there's something very wrong with the incentives for browser development, the Google/Mozilla browser cartel, and the evolution of "web standards".
Or maybe new features are still coming out in the W3C specs and need to be implemented. Did you know CSS now has a parent selector, or that JS will be getting functional piping soon?
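For the curious, the parent selector is :has(), and you can already poke at it from JS (support depends on the browser; the element names are just an example):

    // Select every <article> that contains a <video> - something plain CSS
    // selectors could not express before :has().
    const articlesWithVideo = document.querySelectorAll('article:has(video)');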
On functional piping, this has been in the pipeline with a few competing proposals for a while (I prefer the F#-like version myself). Will be surprised if/when it actually makes it in.
Yeah, that is annoying, though personally I find browsers to be the one category of software where I think this is fine, mainly because of security updates (it sucks that they tend to come with UI updates, but at least on Firefox the main UI is still fairly customizable and I have made my own), but also because they are online software anyway.
Does a browser really have to be so complex that it warrants updates more often than the very sites that are being browsed? Contrast this with the design of idk MP3: a relatively simple and ultra-stable decoder app with a large variety of backend pipelines that can create MP3s. That's what the pre-JS, pre-CSS web was like. Or, from an information-theory perspective: downloading hundreds and hundreds of megabytes again and again (browsers), then consuming insane amounts of energy to access information that hasn't really changed all that much, isn't very efficient, is it?
>Does a browser really have to be so complex that it warrants updates more often than the very sites that are being browsed? Contrast this with the design of idk MP3: a relatively simple and ultra-stable decoder app with a large variety of backend pipelines that can create MP3s.
The problem is the recent trend that everything has to be a web application. Browsers aren't just for accessing information anymore, but literally for doing everything else too. I personally don't agree with the web-application trend, but this is why a browser is so much more complex than an MP3 decoder: the decoder has to do a single thing, while browsers have to do more and more things.
Even your simple example, MP3 decoders, which do one thing, have had code-execution exploits and other security issues over the years. WinAMP had CVEs for it.
All software will have bugs. I want my fixes fast and often for an environment where I run untrusted code from that many places.
Does it have to? I don't know. But it is. It's an operating system that 3rd parties execute random code on, and you hope it stays sandboxed. Those websites? Thanks to ads, they don't update once in a while, but usually once every few seconds.
I just download it from Mozilla and install it in my home dir. This solves both the issue of Debian not updating Firefox fast enough and of them updating it too fast.
This is infuriating to me as well. And the stupid Firefox restart dinosaur always comes up at the worst possible time. I would have switched to a new browser years ago, but Firefox seems to be the only one that supports vertical tabs.
Maybe I'm missing something or set some settings years ago and have forgotten but I have zero problems with restarting Firefox after an update from my package manager.
I regularly have 2-4 Firefox windows with dozens of tabs in each and a "You need to restart" button press takes like 4-5 seconds max to close all windows and reload them all with all of my tabs as they were. The most I have to do is stick each instance on the right workspace.
Granted each tab will reload when opening it but if I'm updating my OS packages I'm probably not exactly "in the zone".
I am using opera with “tree tabs” extension, but afair there is a similar (or the same?) extension for chrome. Although they do not hide the horizontal tab bar.
Edge is not free. I wouldn't seriously consider using a non-free application for something as essential as everyday web browsing.
You can't really have usable vertical tabs in Chromium via plugins either, unless you're content with wasting a lot of horizontal space for an ugly sidebar and vertical space for uselessly duplicated tab bar.
Firefox is the only actual choice I'm aware of.
Using Firefox definitely supports the existence of a free browser. Loss of market share is the #1 threat to the continued existence of a free browser. Beyond the obvious (if a tree falls in a forest, crushing the last copy of the code for a browser that has zero users, then was it a browser at all?):
lower market share =>
nobody testing against the free browser or fixing site breakage =>
quirks (bugs, underdefined specifications, nonstandard features) of other browsers becoming required for a functional Web =>
free browser is no longer a browser of the actual Web.
Being controlled by corporate interests is completely orthogonal to being free. A lot of Free Software is being controlled by corporate interests and there's nothing wrong with it.
Given that the issue of Firefox being forced to restart primarily happens on Linux, I doubt Edge is an option for them. Though I have to concur that Edge has one of the most stable and smooth vertical tab implementations around, most of the plugin-based ones are more fully featured but much less reliable.
I use it, and it's decent - more in the vein of "it's not Google" - though I do slightly prefer the Chrome dev tools to the modifications Edge has made. I don't like a lot of the "helpers" for shopping, though, and I definitely don't like the article wall with ads that are really hard to block/script out.
The first time it happened, I would get rid of whatever software was involved in causing it and never use it again, except for testing purposes.
I completely agree with you about everything else too. The Web is mature enough that I can use a well-tested website with a 20-year-old browser.
There is no technical reason to not have a minimum-viable web browser with a smaller attack surface that doesn't need upgrades for months or even years.
And, in fact, such browsers exist, and I can browse most of the Web that I need with them. I just have to ignore the shitty mainstream, which I am more than happy to do.
As someone else pointed out above, this is not due to Firefox per se but due to the OS changing dependent packages. I understand that if you install Firefox directly from their website rather than through the system package manager, you won't get that behavior. It does not happen on other OSs.
Firefox on Mac. I have this problem as well, even with "autoupdate" set to true. Also, if I am browsing in the middle of an auto update, it refuses to open new tabs.
And each update logs me out of my password manager
Worse, now this is happening to Thunderbird. Wasn't an issue until about a month ago. Now I need to re-install Thunderbird every few weeks because they've pushed an update.
When from my point of view, as a user, I haven't seen a new feature which I truly wanted in over 5 years, maybe more like 10.
I use FF on Mac OS and have never seen it refuse to operate due to an update. It will sometimes pop up a dialog letting me know there is an update, but I can dismiss it. I tend to check for updates every couple of weeks anyway. The update process is mostly painless, though it does restart the browser. That is probably a good thing anyway. Most updates I get when I update the browser for other reasons, and it installs an update as it is restarting.
That sounds truly horrible. I've only recently returned to Mac OS as primary OS (one reason being annoyed by Ubuntu and FF) but why aren't you using Safari and Mail.app then if I may ask?
That this complaint is pointed at Firefox when it's not Firefox at all is interesting. How many things do we associate with one thing when it's actually something else entirely?
What does then happen? Do I get some nudge to update?
I for one don't really mind the auto-update. What I can't stand is that it forces me to restart while I am in some workflow. I'd be fine with "you should update, click here when ready", which I could click five minutes later when done with the task.
This reminds me of iOS apps that as soon as you open them require an update to the latest version and won’t let you proceed any further (in the app) until you click OK/do the update. I always make it a point to go leave a one star review (or update my existing review to one star), and point this out as the reason
> I don't use software that thinks it knows better than I when I should stop using it.
First, it's not software that thinks it knows better. It's whoever maintains that software. Humans. Who read and write the code that makes that software work, and who find problems with the software and fix it.
Now, with that out of the way, and the assumption that someone maintaining software should notify you if the version you have has outstanding problems, yes you should still be in control of making the decision to continue using it, or to get a newer version. But you should be OK with software maintainers having thoughts and opinions about whether specific versions are problematic.
I blame black hats. If it weren't for viruses and exploits there would be a lot less pressure to keep things updated. Remember how much flak Microsoft got for vulnerabilities pretty much ever since Windows got a networking stack? Meanwhile academic networks got along fine for decades running on unencrypted NFS/NIS.
The black hats keep the developers releasing new versions, and the product and marketing teams keep a list of "features" and "improvements" to slip into every new version. Coincidentally most of the bugs and security vulnerabilities are caused by these new features and improvements. It's a vicious cycle.
I don't want most of my software to auto-update, but I certainly want the browser, which ingests untrusted data from unknown sources constantly, to get bug-fix updates automatically.
Firefox on Ubuntu (on most distros, in fact) doesn't update itself automatically at all, which is why there's no option to disable it. It's apt (or snap?) that updates it.
I like getting a notification from the browser that there's a newer version, to give me a sense of how long I'll be waiting for the distro's updated package. I don't use Ubuntu though, so I'm not sure if that's in there or not.
+1 on PSP after v7, especially after the Corel buyout of Jasc. It was hands down a favorite before; after, it just got worse. Kind of wish I still had my v7 and serial number, though I could probably find it... I've been okay with Pinta and Paint.Net for most things I need such software for. I still liked PSP better.
> imagine if the software decided by itself that it is too old to run or to replace itself with a new version against my wishes
Why I will never buy another Apple product: I had a great video-editing station I would author DVDs with, until one day Apple decided I needed to upgrade my OS to continue using the software, which of course required a new computer. Apple's dead to me.
I am sensing a greater divide between people who demand more control over the technology they use and those willing to relinquish it.
The push for silent updates, forced upgrades, always-on tracking, remote bricking, notarization, (corporate-cartel) trusted networks, making things harder to open and harder to repair, authorization for everything - all of it points toward control.
I believe another generation of hackers will emerge from this transition.
I find it interesting that Wirth would wage jihad against needless complexity, and then put an object-oriented programming language at the heart of his "back to basics" workstation Oberon.
Sweeping complexity under the rug isn't the same thing as removing it.
I still use this strategy, though these days, I primarily use Mosaic and Netscape for testing.
I tend to browse without JavaScript enabled, except for places I already trust to not abuse it. And if there is anything blocking me from accessing the page, such as a modal dialog, a cookie notice, a survey, a prompt to sign up for the newsletter, I close the tab.
Over time, I have found that type of rude lack of consideration for the reader's cognitive load and ability to correlate highly with low-quality content that is a waste of my time to read, so this practice also saves me a lot of reading time.
And every day, my Internet gets better and better.
I like the term "featurism", I'm going to try to remember it.
I feel like the strategy of closing sites with newsletter sign-ups, dialogs, etc. is just going to waste your time. If I'm searching for something, I just want the information. I'm not gonna close websites till I find one that has no popups/modals!!
Popup Blocker [0] and uBlock Origin for Firefox will do a pretty good job of getting rid of the junk you don't want.
Certainly a reasonable choice on your part. We all want what we want.
I have a friend who, when he goes grocery shopping, grabs the first item he sees from each of the items on his list. As far as I can determine, this is without regard to price, quality, or any other factor.
Statistically speaking, your friend should be getting average products from each category in the long run. Practically speaking, however, the assumptions that go into that statement likely won't hold, as supermarkets go to some lengths to make sure that the first item that meets your eye from any category is one of the pricier options.
More profitable for the store. The actually expensive stuff is on the top shelf. Eye level will probably be the store brand and the national brand, which skew average or a little below.
It’s one of the instances where your interests and that of the grocery store are mostly aligned. Grocery store wants you to buy more and rotate inventory not necessarily pay a lot for it.
It may seem that way, but I have found it to be very effective.
In my comment, I was thinking of the context of links I click to from HN.
But it also works when searching for information. When I close the tab of an unfriendly site, which, remember, likely has lower-quality information than an accessible one, I am immediately freeing myself to click the next link in the search results and get my information from an accessible site with higher-quality content.
Given the disparity of the quality of content, it is quite likely that I would have ended up at the latter site anyway.
I concur on using an adblocker to block modals, cookie notices, dickbars, related articles, author profile pictures, etc. In fact, I never dismiss modals, because doing so may imply consent to tracking cookies (you never know these days…). I just block them with a quick cosmetic selector-based filter.
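For reference, the cosmetic filters I mean look roughly like this in uBlock Origin's "My filters" (the hostnames and class names are placeholders - in practice I just pick the element with the element picker):

    example.com##.cookie-consent-modal
    example.com##.newsletter-overlay
    example.com##body:style(overflow: auto !important)

The last one is the uBlock-specific :style() form, handy for undoing the scroll-lock that usually comes with these overlays.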
Unless it's a personal site/blog I typically open a site to find information I came for, and close it as soon as that task is done. Anything that makes this take longer than it needs to gets blocked. Sites should strive to stay open for as few seconds/minutes as possible while still giving me the requested information.
Regarding text/image-focused sites that require JS: I generally find sites made by people who haven't figured out how to send text over HTTPS to have low-quality content befitting of their low-quality stacks. I'm all the better for using my adblocker to block them from my search results page forever; it saves me time in the long run.
There are pros and cons to the approach. Information from such actively hostile sites tends to be of a lesser quality and of questionable reliability anyway IMHO.
Before closing the tab, I check if View -> Page Style -> No Style gives the text I was interested in reading.
If I arrived on the page, chances are there's some information I'm looking for.
Still plenty of websites which run on HTTP. I know mine certainly do.
Security isn't everything, there's also accessibility to think about.
HTTPS breaks in many circumstances when it's not needed, including on current browsers.
You can also use a stripping proxy.
Bigger issues I've encountered with actually using Mosaic are that a) it does not support the Host header, so you must have a dedicated IP address, and b) it doesn't like semicolons in the Content-Type header, which most of today's servers include.
> Security isn't everything, there's also accessibility to think about.
TLS is not just about security. It also protects the authenticity of your content, ensures privacy and even accessibility for your visitors. Without it, any node between your visitor and your server could inject content and JavaScript to do all sorts of nefarious tracking and profiling.
These days there's really no excuse for using plain HTTP on public facing sites. Please rethink your decision.
It's not just your site: an HTTP site allows for injecting HTTP urls for images/css files on other sites, e.g.
<link rel="stylesheet" href="http://target.site/...">
If target.site's auth or session cookies aren't properly protected, this is sufficient for stealing them [1].
And plenty of sites have insufficient protection on their cookies [1].
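For context, "properly protected" here mostly comes down to the cookie attributes on target.site's side; something along these lines (cookie name and value are placeholders) keeps the session cookie from riding along on a plain-HTTP request like the injected one above:

    Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax

Secure stops the browser from ever sending the cookie over plain HTTP, and HttpOnly keeps it out of reach of scripts; without at least Secure, the trick above hands the cookie to anyone on the path.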
There are risks and there are benefits. Managing the risks to get the benefits is the game.
My sites only use cookies for user preferences and client-linked session tokens, so I am not too worried.
A different way to mitigate the risks is by only storing information that does not need to be private, remaining small and un-interesting, designing for transparency and auditability, and keeping a close eye on those audit logs.
Actually, there is a very good excuse: I want to access the content I need with the browser and configuration I currently have, and without the ability to alter it.
While the threat of MITM is technically real, thanks to being on a relatively safe network segment, I have yet to encounter it happening in the real world, except in the case of captive portals, when I actually want it to happen.
I think it is largely an over-stated threat for most read-only applications.
The impact on accessibility, on the other hand, is real and huge.
You must be lucky. My cellphone carrier used to inject scripts and ads into plain HTTP requests. Imagine seeing ads on Wikipedia. Now that HTTPS is everywhere, this behavior has largely stopped.
Are you sure you were seeing ads on Wikipedia? AFAIK Wikipedia itself doesn't run any ads [1] (except the occasional call for donations), so the ads must've come from another source (e.g. malicious extensions, malware that installs a MITM proxy to inject ads, etc.).
There was a similar discussion about "provide unencrypted access to your pages for accessibility" a few months ago here on HN, under the "Consider disabling HTTPS auto redirects" post [1].
I don't know about Mosaic[1], but there are many alternatives out there. I frequently use elinks as a text-only browser since it makes an attempt to lay out pages (most pages have some form of layout to handle navigation elements these days). There are also browsers that handle graphics but not scripting or CSS, if that is what you want.
[1] I have heard of people trying to get Mosaic to work on modern machines, but I think the efforts were restricted to rebuilding the software rather than adding features.
I've been quite impressed by the Gemini browser LaGrange[1]. It does its own antialiased text rendering straight to SDL, with automatic site coloring as an option. Would be neat to see something like this with basic HTML 3-ish support (basic tables, lists, etc.)
I'm just saying I think making a new Mosaic would be a fun project. I've been watching Andreas Kling do his browser hacking lately and it looks like a lot of fun.
I think they're two different, related words. Featuritis, the condition of software having too many features, is caused by featurism, the underconstrained and overemphasized pursuit of features.
Not to mention that people out in the country suffer from featurism. In 2009 I had dial-up because broadband was unavailable where I was, and it took as long as 12 minutes to load some pages. I have since moved and no longer rely on dial-up, but I can only imagine how long pages would take now with all the bloat. Honestly, I'm not sure how dial-up users manage these days.
The comment seems overly dismissive; I believe the parent raises a valid point. If we accept that we are consumers, and that capitalism theoretically expects one to be an informed consumer, then it is primarily up to us to determine the future shape of the landscape. Naturally, what follows seems to be:
-we are not informed consumers
-informed consumers are a minority
-uninformed consumers are a boon to capitalism
To your specific point: is it ivory tower to know what you want and make decisions that benefit you specifically, and not partake in the Zeitgeist just because everyone else does?
So the "movement" against modern browser features is not a movement and has nothing to do with these features but its simply a version of "kids these days lost their ways" thinking. A version of conservatism, I guess. "Everything was better in the good old days" and "The new generation is horrible, the humanity is doomed" kind of thoughts are probably a manifestation of fading youth.
It is also possible that stuff is actually getting worse, though. After all, I remember friends of mine who were in university raging about how bloated the web was around 2000, and I'm 99% certain they weren't middle-aged men disguised as teenagers :-P. If anything, I don't remember knowing anyone above 25 back then who even cared about web stuff.
After all, having some people see and complain about things getting worse doesn't mean those things will stop getting worse, if most people don't care - or don't even notice them getting worse - enough to do something about it.
When you look back, you can notice patterns: something new and amazing comes out and promises to take humanity into a new age. At first it is fuelled by enthusiasm, with believers working on it for free just to make it great and paying their bills by working for money on the old-school stuff. Their prime motivation is not profit.
Then this thing starts becoming profitable, in the sense of bringing street cred and money, and a new kind of people rush in with their new ideas of how to use this new thing. These new people don't share the ideals of the original believers, but they know how to build machinery around it to make it profitable and appealing to the masses. The new thing that was supposed to change the world becomes a concentrated and optimised version of the things before it.
The believers then become bitter purists and try to fight the new order by disowning the current technologies or methods and catering to niche hipster elitist circles, while the rest continue to do their thing.
You can see it in everything: in the printing press, in radio, in TV, and in things that are not media - cars, clothing, shaving, coffee - everything.
Things don't become worse, they just become mainstream, and that mass adoption is driven not by purists but by people with no regard for the original ideals of the technology. The masses love it this way, and it stays this way until it's made obsolete by something else.
The feature where the CPU fans spin up after a webpage displays a table with about 30 items in it is pretty swell. Others might cry w.t.f. and avoid using such bad software as much as possible.
The feature where Firefox repeatedly changed your preferences and helpfully showed PDFs with the janky JavaScript viewer was pretty terrible. Changing that preference a third time won't make it any more charming. Yes, yes, the cattle are supposed to be OK with the "movement" of their cheese; move along now, nothing to see here.
> ... clothing, shaving, coffee - everything
Uh, no. Coffee in America started out cheap and for the masses (following some sort of Tea Party, I think it was), then became even more for the masses (now with pre-ground beans, instead of using the mill in the stock of your Sharps Carbine), and only very recently has there been a movement towards non-mainstream "hipster elitist circle" coffee made by purists.
> Things don't become worse
This is not what I've read; for example, British church organ making went through a rough patch around the decade of 1900 or so. With a little study of history, more such examples could doubtless be found. One might even be optimistic that the modern web will pull itself out of the "big miasma"[1] it has sunk into. But if the powers that be are blind to criticism and go on about "tooling issues" or whatever, eh, it might be a while before changes can be made for the better.
By the way, Arnold Toynbee said some pretty funny things about blind elites.
We aren't complaining about cool new features that the normies like, though. We are complaining about the fact that some of us have gigabit-per-second bandwidth and still need multiple seconds to load and render a page. We are complaining about the fact that it is literally infeasible for anyone in the world to build a web engine from scratch at this point. The problem isn't catering to everyday people; the problem is preying on their ignorance and forcing them, and us, to tolerate a worse tool because they don't know any better. And when I say they don't know any better, I'm not saying "they'd use Linux if they knew better"; I'm saying they legitimately don't know that over half (and I'm being generous) of the data they download to render a news article is shit they're not trying to access but are paying for - they think it just has to be that way for some nerdy technical reason they don't understand.
This is not about ideals, this is about foisting crap on the world that literally nobody wants or needs.
"original ideals" is a bit of a fantasy. Technology has always been used for both good and bad (and which is which is is subjective). It's human nature.
We always complain about bloat and slowness, then we build faster machines and networks and bigger storage, and software and content promptly expand to fill it all up and make it feel sluggish again.
For a chuckle, try running old software on a modern computer (fire up DOSBox and run WordPerfect or something) and be amazed at the speed of the thing. That is why we built a faster computer :)
We've made machines so much faster that a lot of old software is unusable because it's too fast.
One of my favorite examples: Lego Island's driving mechanic is tied to frame rate, so on a modern machine tapping left or right will fling your car 90+ degrees: https://www.youtube.com/watch?v=2CmqbccCqI0
That's what the Turbo button on older Intel machines was for: it could slow your processor roughly to the speed of an 8086.
I discovered this playing the old DOS RPG Drakkhen, where there was a tower with a shark circling in its moat. It would jump and eat your characters if they happened to cross the bridge at the wrong moment. The funny thing is that its speed was based on the processor clock (the game was released in 1989), and if you played the game on a faster processor, the shark was too fast to be avoided. Push the turbo button, reduce clock speed, problem solved.
20 years from now there will be HN posts like "In my good old days we wrote efficient Electron apps that only used 2 GB of RAM to display a todo list, today's developers are lazy, there is no need to ask the AI to synthesize such a simple app, and certainly it shouldn't need a QPU (QuantumProcessorUnit) to run".
Back in my day, when kids heard older people talking, they listened because they knew they were hearing from someone with more experience than themselves. Kids these days lost their ways and think that knowledge and judgement are achieved in youth and then gradually fade away.
Just kidding, back in my day kids assumed the same thing.
Also, this exact conversation has been happening since the web was created. Linked article is evidence of that.
I agree with your commentary along the lines of o tempora, o mores, in that you could argue the pattern remains largely the same.
In a more practical sense though, some new things are good, some new things are bad. It makes sense to adopt the good stuff and cut off the bad. Deck? Mostly good. Requiring a phone number to play a game? Bad. Naturally, it is a very subjective process and we are bound to disagree on details.
Not that long ago a family member tried to use the same argument used here (it's a generational thing; old people just hate new stuff) when trying to convince me that Venmo is actually good, as I was trying to gently indicate that maybe a payment system that by default announces to the world[1] that I just spent X on Y may not be the best thing since sliced bread. Working near that space, I was amused, but to each their own (and I certainly am not going to tell the guy how to raise his kids).
This exact conversation (modulo the web) has likely been happening since humans became a thing. It's an endless cycle, and it takes a lot of self-reflection to step out of it.
No. It's not about anything old; it's about stripping things down to their essence.
A browser that can render text, and not much beside the text, emphasizes the text of a page, which, for many pages, is the content the user came for. Everything else is fluff.
In modern times, "Reader mode" in browsers does the same thing: removes fluff to make reading easier.
Of course it's great that now browsers support advanced features and enable amazing interactive pages like https://ciechanow.ski/internal-combustion-engine/, or just "simple" GMail. But such pages, and such highly interactive web applications, are fewer and further between than most pages that provide value by showing just static text and static images.
This strikes me as somewhat dismissive. My take is that the concerns raised in 1999 are even more true now. The problems we have with low-quality, bloated websites with questionable content are worse, and the rise of 'software that knows best' is only accelerating. If anything, this seems prescient.
Also notable is this statement in the context of modern SaaS products:
"Who knows whether a new version will be available and usable when the current one stops working?"
With SaaS products the company can go out of business; they can change the product on you without your opt-in, making it unusable (or less usable) for your purposes and there is no way to downgrade to the previous version; they can stop supporting your web browser; your account can get disabled or compromised; new policies and regulations on data retention and privacy could render a particular SaaS product unusable for your purposes.
What are you meant to do if a SaaS tool you depend on suddenly stops working? How do you install the old version and get back to work?
I completely agree with the criticism of bloat, but IMHO it is a problem because the tooling for achieving "modern software" is bad, not because of the requirements of modern software.
In other words, the criticism is fair, but the solution of attempting to freeze time, or even to go back, is not right. Sometimes, though, when things get very bad, going back to basics and redoing everything can work.
Young me (win95-2k era) had this feeling after researching older software principles. I think it is not [only] the effect of personal aging, but also the piling up of junk on top of an initially simple idea in any area. We just tend to notice it with the years, because everything straightforward-back-then becomes complicated, while everything straightforward-today flies under the radar.
But for browsers there is an obvious need for apps that are not "vb in an empty vba-enabled document". This ugly heap is only stable because enormous effort and skill go into making it so.
Most people on HackerNews remember when they were the target demographic for most software. And they no longer are.
If you're under 40, this is how we all started. The target demographic since the explosion of the internet is no longer software engineers. That feels bad for people, and they lash out conservatively.
Because they're not wrong - the software IS worse. For them (us).
I could agree with that, if only I didn't know so many non-tech people who also think that tech started to go downhill at some point - and who specifically complain about some of the same things, such as forced updates and dumbed-down UX that actually makes their lives harder.
I don't remember ever being excited about disruptive software updates. If anything, I'm more patient at sitting through an app update on a weak cell signal than I was when I was younger.
I remember the days of waiting for the early adopters, reading the reviews and letting bug fixes and patches get released before making an informed decision to upgrade, if upgrading was considered worth it.
For things like security patches, pushing updates has been a net positive IMO.
For absolutely everything else, it's a disaster. I always used to think that the "free software purists" were a bit too radical for my tastes ... but now, in the era of SaaS, I find myself agreeing with them. I want to own and be in control of my hardware and software. Let me decide if upgrading is worth it.
You're right if you're using the web to display content. That was the case in the 90s. In that case, yes, a simple index.html with an <h1> and a <p> is fast, responsive, etc. But with webapps being more and more common, one could argue that displaying text is not necessarily the web's main purpose anymore. If you try to access Figma with a text-based browser it's gonna crap the bed, so it fails the test - but is it a relevant test, though? The web is bloated, but it didn't bloat just because engineers were bored; there were genuine use cases where doing more than just displaying text was needed. And it wasn't ONLY for marketing purposes (though that played a big part, I'm sure).
There's also SEO as a simple reason why JS-heavy sites correlate inversely with content quality. Yes, I know Googlebot attempts a time/memory-bound render of a JS site to arrive at a DOM for text extraction, but this won't work with other search engines, and it will never work as well or as timely as serving static HTML to Googlebot, no matter what.
I use NoScript for this, plus I keep running into sites that present a screen stating "Cloudflare is checking". That Cloudflare check requires JavaScript to be enabled, so I just move on; to me that means the site cannot even count me as a 'view'. Makes things a bit easier for me too :)
Wow. I wonder how you use stuff like instant messaging, music streaming and so forth. Do you also skip all browser-based desktop apps because you hate JavaScript so much?
No need to get upset at the guy. It's their choice to use JavaScript the way they want, rather than let any site that wants it use it. Most websites with articles don't need JavaScript, and only use it to push ads/paywall articles. If there is a webapp that they want to use, they can always whitelist it.
As for music streaming and instant messaging, not everyone has/needs Discord, Slack, Teams, or anything like that. If you absolutely need it for work, just whitelist it. No problemo. And, not everyone streams music. I barely used Spotify myself until I got an office job and needed something to fill the boredom.
The issue isn't about stopping Javascript as a whole. The issue is about permissions management. Not every site needs it, so not every site should have it. If you need it to work for a webapp, just enable it.
I'm not upset, I am just curious and think the guy had quite the ignorant take on the whole subject matter.
I kind of disagree, though: most websites do need JavaScript even if they contain only an article. Extremely few devs will care about the astonishingly small minority that disables JavaScript by default, and saying that the websites that do this are trash, like so many people who favor this kind of thinking do, is being ignorant.
The JavaScript hate is strange, because it's like hating programs written in Python for no other reason than that they're written in Python. I think it comes down to devs not being able to choose: the only real choice is already made for them, so they end up hating it.
But even alternatives like Phoenix LiveView / Hotwire that minimize the usage of JavaScript require JavaScript to work properly. In time I believe most websites will ship a wasm binary, and then this debate will be over forever and JavaScript will become irrelevant.
This is what Twitter does. It's possibly the least performant major website I've used.
Conditional loading certainly can improve perf. in theory. I've yet to see any evidence it does so in practice. The aggregate of bundle-size, bundle-parse, client-side execution resource-usage & added latency of the plethora of metadata normally bundled with API responses is more than enough to negate any actual perf. gains.
As for "easier to maintain", I've never seen anyone even try to make that argument in theory, nevermind practice. Pretty sure it's widely accepted even by advocates of this architecture that it's a trade-off of perf. gains for ease-of-maintenance losses.
Just because Twitter does that doesn't mean it is the case everywhere else.
It moves some of the rendering work from the backend (having to query the data and generate markup) to the browser (query the API and generate the content based on the responses).
At my current job, it's made a significant improvement. The server returns compact JSON data instead of HTML, so it's easier to generate the data and uses less bandwidth.
It also looks faster for the user, because they change search parameters and only part of the page changes, rather than reloading the entire page.
As for "easier to maintain", that may be subjective. Code to generate a simple HTML template from results is replaced by JavaScript code to hit the API and generate the DOM. Although HTML5 templates makes that much easier.
I'm not saying it's impossible - glad to hear you've successfully implemented it in your workplace. I'm just saying that by-and-large it has the opposite effect to the stated intent.
If most examples of a strategy make things worse, and only one person uses that strategy to improve things, then going around saying "everyone is doing it wrong" rather than questioning the strategy isn't particularly sound.
I've built plenty of (small) client-side rendered UIs myself that lazy-load content; I know the trade-offs, and I even believe I can achieve a performant outcome on my own. But that's anecdotal. In the wild, I have not seen a single major website improve perf. via lazy-fetched content rendering.
I can't say I've seen a single site where, in practice, that worked as advertised. Also, sometimes it introduces UX annoyances (e.g. the back button not working as expected).
It is one of those things where, in theory, if absolutely everything is done right and nothing else is done differently, it can work. E.g. if the only difference between a JS-enabled and a JS-disabled version of the site were the content change and nothing else (no additional JS frameworks, functionality or whatever), then yes, it most likely can be faster (though for the difference to be noticeable the site needs to be rather heavy in the first place).
The problem is that in practice this comes with a bunch of other baggage that not only throws the benefit out of the window but introduces a bunch of other issues as well.
Is there any evidence of this? All the sites I've seen that use extra requests to load text always seem to take multiple seconds to load, whereas most pages that use server-side rendering generally load in under 100ms.
It's a tradeoff; basically the question is "will most users need and read all the content or not?". Displaying everything at once without making extra queries is best, but not always possible. The frontend is fetching from the backend, so it's going to say "hey, send me all the comments from all the posts from November 2021". If there are 3, that's fine, but if there are 23,000 of them you can't really load everything at once - that's why we use pagination on the backend. We say "hey, send me results 1 to 25 of the comments from all the posts from November 2021". This way the frontend only displays 25 comments for a quick page load, and we hope that will be enough.

To display the other comments, either we ask the backend how many pages of 25 elements there are and display that many pagination links (pagination), or we simply have the frontend ask for the next page once we reach the bottom (infinite scroll). Even if displaying all the content is possible, if there is content that only 1% of your users will read, you might want to offer faster loading for the 99% and add a few seconds of loading for the 1%.
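In code, that backend conversation is usually just a couple of query parameters; the endpoint and field names below are made up for the sake of the example:

    async function loadComments(page) {
      // "hey, send me results 1 to 25 of the comments from November 2021"
      const res = await fetch('/api/comments?month=2021-11&page=' + page + '&per_page=25');
      const { comments, total_pages } = await res.json();
      // Render `comments`, then either build `total_pages` pagination links
      // or call loadComments(page + 1) when the user scrolls near the bottom.
      return { comments, total_pages };
    }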
>This way the frontend only displays 25 comments for a quick page load
Many years ago, smart frameworks implemented smart stuff like displaying only what is visible. For example, you could have a table with 1 million rows, but your HTML page would not create 1 million row elements: you create GUI widgets only for the visible part and, as the user scrolls, recycle the existing widgets.
As a practical example, you go to a YouTube channel page and it loads only 2 or 3 rows of videos, and you have to scroll to force more to appear. This means you can't do a Ctrl+F search, and it is also less efficient, because as you scroll the items at the top are not recycled and reused, so probably more memory is used.
The JSON for all the videos is not huge - some strings with titles and thumbnail URLs, maybe some numbers - but the issue is that it is not possible to natively do the best/correct thing; only recently did we get lazy loading, for example. Basically, HTML was designed for documents, while frameworks/toolkits designed for apps did the correct thing many years ago... this is an explanation, but no excuse, for why things are such a shit show with pagination today.
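A minimal sketch of that windowing idea in plain JS, assuming a scrollable container of fixed height and rows of a known, constant height (the element ids and row height are made up). Real toolkits recycle the row widgets instead of re-creating them, but the logic is the same: only the visible slice of the million items ever exists in the DOM.

    const ROW_HEIGHT = 24;                                   // px, assumed constant
    const container = document.getElementById('list');       // scrollable element with a fixed height
    const content = document.getElementById('list-inner');   // tall inner element the rows live in
    const items = Array.from({ length: 1000000 }, (_, i) => 'row #' + i);

    content.style.position = 'relative';
    content.style.height = items.length * ROW_HEIGHT + 'px'; // keeps the scrollbar honest

    function render() {
      const first = Math.floor(container.scrollTop / ROW_HEIGHT);
      const count = Math.ceil(container.clientHeight / ROW_HEIGHT) + 1;
      content.textContent = '';                               // drop the previously rendered window
      for (let i = first; i < Math.min(first + count, items.length); i++) {
        const row = document.createElement('div');
        row.style.position = 'absolute';
        row.style.top = i * ROW_HEIGHT + 'px';
        row.style.height = ROW_HEIGHT + 'px';
        row.textContent = items[i];
        content.appendChild(row);
      }
    }

    container.addEventListener('scroll', render);
    render();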
The argument is that JS-heavy site design indicates worthless content on average. Not that it's easier to maintain for the site owner (which might or might not be the case), or more realistically, creates job opportunities for "web developers".
Google Maps is one of the sites I still whitelist, but I often reconsider this decision, and I'm ready to find a replacement.
Today's Google Maps is a shadow of its original self, which did have a no-JS version, by the way. It has gradually gotten simultaneously heavier, less convenient, more annoying, and less useful, and I've just about had it.
Just off the top of my head, it no longer displays zip codes, takes a long time to load, has missing street names on the map, often promotes features I do not want while taking away features I do want, and is covered so thick with paid-promotion items that I can barely find somewhere to click that isn't an ad.
Sites that break with ad blockers are built to show ads. They only incidentally show any content, in order to lure users to see the ads. Users with ad blockers have negative value for such sites.
Sites that are built to show content but depend on ads to sustain the operation usually show a plea to support the site in a different way (by a donation or something) if they notice that ads are blocked. I find this more honest.
It might have third party trackers (Facebook, Google, Twitter, typekit.net whatever that is, etc). It might also have first party trackers of e.g. your mouse and keyboard that you’d rather disable.
Needs some max-width set to whatever the average screen width was back in 1999.
True story: I re-uploaded a website that I wrote in 1999, and only now did I discover that my header was never centered - it was just floating left. It's the only thing that looks off; everything else works perfectly. HTML/CSS/JS really is a stable stack for the computer field.
800 pixels was a common screen width in '99. Even then it was not considered good practice, design-wise, to allow text to run the full width like it did in the mid 90s! There was no max-width; it had to be constrained with tables.
Yep, agreed, it's a stable stack. I have a few archived sites I made in the 90s that work fine in today's browsers, rollover JS buttons and all. My disdain for IE is visible in the code. I was a Netscape guy for sure!
    if (navigator.appVersion.indexOf("MSIE 3") == -1) {
      imageObjectSupported = true; // this line only executes in browsers that
                                   // support Javascript 1.1, except of course for IE3, which thinks it
                                   // supports Javascript 1.1, but doesn't.
    }
The Phoenix browser was the best I've ever used. From v0.1 to v0.5 it continually shrank in size, as the improvements were purely to speed, size, and the user experience. At v0.5 it was six-and-a-half megabytes. But by 0.6 it was getting bigger again. It already had enough of an HTML engine to render 95% of the web... but people just wanted more and more and more features, and refused to put them anywhere other than in the browser, defeating the point of the whole project.
It's interesting that the author would associate "frames, tables, and other fancy features" with a lack of interesting content. I assume tables are included because they were widely misused at the time, not because tabular data wasn't interesting.
I have a button in my toolbar called "dammit" which strips away every iframe, embed, object, audio, and video from the page, multiple times per second. These are considered foundational elements in the web platform and yet, when they disappear, pages seem to magically collapse into something readable.
Can you share how this works / code snippet? Would love to add this to my "Fk the modern internet" toolset. (I also use TamperMonkey which injects client js when url patterns are matched, to get rid of annoying cookie modals.)
Posting bookmarklets on the web tends to be frowned on by forum admins, so here's the raw code.
    (function dammit() {
      // Re-run every 100 ms so elements injected later get removed too.
      setTimeout(dammit, 100);
      let els = document.querySelectorAll('object,embed,iframe,audio,video');
      for (let el of els) {
        el.parentNode.removeChild(el);
      }
    })();
Feel free to make it into a bookmarklet using this guide:
Around 1996 or 1997, I saw Netscape showing a small animated GIF. Every cycle, the memory use of Netscape went up by the size of the GIF. On a machine with 4MB of RAM, it did not take long for it to stop working.
I was at a startup from 2000-2003 where the chief scientist was a security wonk and demanded that our product worked with and without JavaScript enabled.
No wonder the startup failed. Imagine trying to make a useful product with one hand tied behind your back!
I just checked, Google and Amazon still work flawlessly without JS. And many other "useful products" too.
The question is, if a startup can't make a site that works without JS, are they actually focused on the product? Assuming it's not a webapp, of course.
You're making two completely different versions of your web product: one with a rich, modern experience using JavaScript and the other using vanilla get/post.
I've built several SaaS products and I can't imagine building a complicated product that supports both a JS and no JS version with a small, startup team.
It can go the other way, the CEO at one company I worked at demanded that we have a rich, dynamic user experience with real time editing AND we fully support all the browsers that had visited the site in the last 6 months. We had people using Blackberry, IE6, Opera, Konqueror and stuff you've never heard of.
They were happy to force javascript but wanted it to work on early smart phones.
The VP product was losing her mind fighting the CEO over this.
Bizarre. It's like starting a parking lot business by trying to accommodate every vehicle you see driving by your site over 6 months. You could have Winnebagos, tractor trailers and more bizarre vehicles, none of which you should consider as viable customers.
Well, maybe that parking lot example will help you in the future to explain why this is a dumb approach. Expanding the customer base has its own costs, so you obviously want to target some point of optimal return (revenue from customer - cost to support customer), otherwise you're just shooting yourself in the foot.
2000-2003 was a very different time, and people making javascript heavy sites then tended to be terrible. If I remember right gmail worked fine without javascript even after that time period.
With that attitude you likely were writing for IE5/6 only too. We're still dealing with the fallout from that 20 years later.
I worked for a company that had one user complain about the site requiring JavaScript, because he didn't "trust" us. Manager asked him why he trusted us to create an account on the site.
It's a great experience to open the HN website, which is not overloaded w/ JavaScript. And even greater to go to another website which is even more lightweight. This feels like a good and friendly internet.
> Featurism is usually inversely proportional to content, and those who have content generally value being readable.
(as related to features such as HTML frames and tables)
Thanks for this quote attributed to Bernd Paysan, which I can now cite rather than formulating this over and over (for CSS grids, subgrids, columns, flexbox, functions, variables/custom properties and whatnot).
I can understand frames, but tables are useful even for plain-text content, and even lynx and emacs (eww, not w3 - not sure if they are related, though) have support for them.
And companies writing the software that average people use are motivated solely by "what the average Joe wants," where "what the average Joe wants" is defined by business metrics like "how much is Joe clicking" and "how much is Joe spending."
This is not a "bad" thing, but it's not what the Average Joe wants, it's more what the Average Joe is incited to do. More clicking (on ads directly, or on "engaging content" that shows more ads) is what the business owners want from an Average Joe.
But "people do it == people directly desire it" is a manifestly wrong metric. When something is in short supply in a store, people will line up to secure the chance to buy it. But Apple would be insane to think that people lining up to buy a newest Macbook on the day or release want lining up, and that more lining up is what they'd enjoy. People tolerate lining up to get what they desire. Equally, people tolerate more clicking in order to get what they desire, and what they'd likely prefer to obtain with one click.
> This is not a "bad" thing, but it's not what the Average Joe wants, it's more what the Average Joe is incited to do. More clicking (on ads directly, or on "engaging content" that shows more ads) is what the business owners want from an Average Joe.
This is a bit of a philosophical question.
Are there things we want that weren't somehow influenced by society (e.g. family, peers, advertisements, culture, etc.)? I'd argue very few things, aside from the basic biological needs. By corollary, almost everything we want is the result of external influence.
> But "people do it == people directly desire it" is a manifestly wrong metric. When something is in short supply in a store, people will line up to secure the chance to buy it. But Apple would be insane to think that people lining up to buy a newest Macbook on the day or release want lining up, and that more lining up is what they'd enjoy. People tolerate lining up to get what they desire. Equally, people tolerate more clicking in order to get what they desire, and what they'd likely prefer to obtain with one click.
If you measured line up times at Apple stores, you would almost certainly find out that its effect (of longer line ups) is a decrease in revenue (and therefore not something people want). If your point was that metrics can be misinterpreted and misused, I absolutely agree with you.
My question was specifically about the revenue/engagement metrics commonly used for software. To me they seem like reasonable proxies that you are building software people want.
Maybe he clicked 5x more because he wanted even more information.... or maybe the basic info that he was looking for is now hidden in a very shitty location that was intentionally made hard to find in order to generate more clicks and the user was super mad the entire time. Either way, the manager gets a bonus for extra engagement that quarter.
Went to McDonald for the first time in a year and used the touchscreen to order. I had to dismiss over 7 popups, including a few that look like they were internally trying to get drunk people to accidentally order more (eg. cancel bottom from the previous popup is aligned with an extra order on the next one). It made the process a lot more panful than the minimalist interface they had before, but I'm sure a group of people got praised for it internally.
You ask them. And when they tell you that your product sucks, you listen, and don't tell them that they're holding it wrong, or that they just need to give it some time and they'll love it, or that everybody else loves it so clearly they aren't average, or that telemetry shows otherwise etc.
Well, if you look at e.g. Meta, which specifically optimised for getting people to 'click' more, it is clear that they did so by building a product with disastrous impacts on mental health.
So there is at least one example where it isn't a good thing.
You can build something people want and which is bad for health, they're not mutually exclusive.
People commonly want things that are linked to negative health outcomes: alcohol, sugar, fast food, lack of physical activity, working a high stress job, living in a city, watching the news, etc.
Personally, I take the position that people know best what's good for them, because I don't see good alternatives to that.
I don't assume it can be measured, and I think large organizations are stupid and morally shallow enough without letting them loose with the mandate to maximize a handful of simple metrics.
I'm inclined to make the comment that the "average Joe" doesn't actually know what he wants, at least with respect to what's actually good for him in the long term.
Oh my, the mention of DJ Delorie brought back memories: that's the author of the famous DJGPP C/C++ compiler, which, along with the Allegro library, was the bomb for developing MS-DOS games back in the day!
Hey, thanks for this! I looked online for the release date of the emacs extension he's using, which dates back to 1997. But he mentioned it being quite old so I didn't know what to put as the (year).
> Die Featuritis ist meist umgekehrt proportional zum Inhalt, und wer Inhalt hat, legt in der Regel Wert darauf, auch gelesen werden zu koennen.
(Featurism is usually inversely proportional to content, and those who have content generally value being readable)
Too long for a tattoo but damn do I believe in this!
Unrelated to this content (I think), tuwien.ac.at is a domain I have not seen nor thought of in decades. I see that there’s new stuff there too, but does anyone recall why this might have been a well-known domain in the 90s? I feel like I used to regularly read some content or download software from there, probably UNIX stuff.
I had the _exact_ same feeling; when I read your comment I started googling a bit but I came up short. In my case it should probably have been one or some of: MUDs, Linux, Debian, Amiga, shareware, usenet, IRC.
I recognised it because I got into Forth in the last 5-10 years, and the author, Anton Ertl, was, together with Bernd Paysan, a creator of GForth in the early 90s.
Otherwise, see here for his CV, interests, and links, which may trigger your memory.
Yeah, getting the most use out of your paper (or picking the paper size in the first place) was a time-honored tradition of print typography. And sure, that's not 1:1 applicable to the screen.
I do think that the screen typography department has been seriously lacking here, though. Scrolling a single column can't really be the end of wisdom.
But the main problem we've got is that the ad people took over on-screen typography once we had progressed far enough that screen sizes and resolutions would actually have made more things possible. Not the tradition of typography that made newspapers, books, etc., but the flyer and full-page-ad crowd. PageMaker, not FrameMaker.
Browsers barely have the tools to make reading effective and enjoyable. CSS is oriented towards other purposes, and the browsers themselves only have the bare necessities – practically hidden "user stylesheets" or the one-size-fits-all "reader mode".
My PostScript is somewhat rusty, but I think defining a fixed page size is THE first step in any PostScript file. I don't think that's really better than PDF in that regard.
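For what it's worth, here is a minimal sketch of what that looks like. (Strictly, PostScript doesn't require an explicit page size; interpreters fall back to a default, but by convention it sits right at the top, in the DSC header and via setpagedevice.) The Python below just writes out a hypothetical hello.ps:

```
# Illustrative only: emit a tiny PostScript file whose page size (A4, in
# points) is declared up front, first in the DSC header and again via
# setpagedevice, before any content appears.
MINIMAL_PS = """%!PS-Adobe-3.0
%%BoundingBox: 0 0 595 842
%%Pages: 1
%%EndComments
<< /PageSize [595 842] >> setpagedevice
%%Page: 1 1
/Times-Roman findfont 12 scalefont setfont
72 770 moveto
(Page size fixed before any content.) show
showpage
%%EOF
"""

with open("hello.ps", "w") as f:  # hello.ps is a made-up filename
    f.write(MINIMAL_PS)
```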
For anyone who was actually around at that time, you need to remember that Windows was Windows 95, and that meant a badly behaved application generally required a reboot.
From what I remember of that time, Windows Explorer seemed a little faster, but most importantly it also seemed to require fewer reboots.
That's not my memory of it. Windows 95 introduced the Task Manager, which obviated some of that. Maybe I was lucky in having well-behaved applications.
I was recalling my time writing Windows applications using C and Win16/Win32. That combination made it easy to write a badly behaved application: a stray C pointer here, an uninitialized handle there, and before you know it, you've created a badly behaved Windows application. That could then easily cause Windows to crash or lock up, which in turn required a reboot.
Developing on a modern OS is so much easier only because the OS can protect itself from these badly written applications.
Not Mosaic, but worth browsing: the capsules in the Geminiverse ( https://gemini.circumlunar.space/ ) have no modern browser gimmicks at all and a very good signal-to-noise ratio. #my2cents
Imagine a world where it's common to cd into a directory and have anywhere from 1 to 10 awk scripts start running, and possibly your image viewer and media player too. Would you think you had malware installed?
> those who have content generally value being readable
I find it ironic that his website looks like hot garbage on a modern ultrawide display.
I realize that it's over 20 years old, and it still looks bad at 1280x1024, which was the resolution I was using back then as a poor college student with a second-hand 19" Sony Trinitron that had a dodgy VGA cable you had to hold up just right with a coat hanger.
I would argue that it's the job of the web browser to provide a default stylesheet such that a basic webpage with a header and a bunch of paragraphs looks "right", which includes defining a sensible viewport size.
That's a completely reasonable expectation, and to my knowledge they never have.
So the author, having lived through the computing sea change of the '80s and '90s, and having said "those who have content generally value being readable", should have anticipated higher-resolution displays and taken steps to ensure readability.
How do you know someone has cognitive impairment which makes it difficult to use JS-heavy sites and/or is using an older device which has performance issues with heavy sites?
Answer: Don't worry, they just won't be able to use your site.
The same could be said about the use of images on a site.
The solution is to develop sites with accessibility in mind. Use a screen reader; experience what ALL your users will experience. Experiment with various rendering tools to emulate color blindness.
There are even more tools to performance tune your website.
This all comes down to the site author taking the time to cater to as many users as possible. It is not inherently a problem with the use of JavaScript, or dynamic elements.
So, yeah, there are a lot of shitty websites out there. Folks that choose to deliberately cripple their browser are more likely to see these shortcomings.
But if we're talking about text-based websites that are basically brochures, all you have to do is fill in the alt attribute. Nobody is asking anyone to make their website of paintings cater to the blind; that's a strawman position.
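To make that concrete, here's a toy spot-check (not any standard tool) that lists <img> tags missing an alt attribute. It uses only Python's standard library, and "page.html" is a hypothetical local copy of the page being checked:

```
# Toy accessibility spot-check: report <img> tags with no alt attribute.
# An empty alt="" counts as present, which is fine for decorative images.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            print("img missing alt:", attrs.get("src", "<no src>"))

with open("page.html", encoding="utf-8") as f:  # hypothetical input file
    MissingAltChecker().feed(f.read())
```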
> The solution is to develop sites with accessibility in mind.
Hmm. No.
I think another poster wrote something to the effect of "once something becomes popular and adopted by the masses, it ceases to be the ideal that believers once strived for".
If anything, it would appear that when you attempt to please everyone, you have to make trade-offs in places that some users may find unacceptable.
"If these browsers don't display anything, or the display looks shitty, there usually is not much content"
Crazy, wild assumptions, given that the web was brand-spanking-new and changing rapidly and unpredictably by the day. I'll never understand these needlessly minimalist perspectives on interacting with the internet (yeah, I'm talking to you, no-JavaScript folks). That approach only results in you missing out on things, and that was even more true in the '90s.
It's easy to forget that 1999 was still firmly in the dial-up modem days. This was as much about practicality as it was about principle: every second spent loading a website quite literally cost you money, so any policy that let you cut short the process of finding worthwhile content, in an even remotely efficient manner, was a sensible thing to do.
Today, almost 25 years later, your ISP bill is the same whether you never turn your phone or computer on at all or you load the heaviest, client-side-JS-generated, 4x-resolution-image websites nonstop, all day, every day.
We used to have a slightly better reason to prefer lean, content-first web pages than we do today.
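To put the dial-up arithmetic in rough numbers (the page sizes and link speeds below are illustrative assumptions, ignoring modem compression and protocol overhead):

```
# Back-of-the-envelope page load times; all figures are assumptions.
def seconds_to_load(page_kilobytes, link_kbps):
    return page_kilobytes * 8 / link_kbps  # kilobytes -> kilobits / link speed

print(seconds_to_load(50, 56))       # ~7 s: a lean, text-first page on 56k dial-up
print(seconds_to_load(2048, 56))     # ~293 s (~5 min): a 2 MB script-heavy page on dial-up
print(seconds_to_load(2048, 50000))  # ~0.3 s: the same 2 MB page on a 50 Mbit/s line
```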