Ask HN: Why is everything changing too fast?
513 points by dmje on Dec 6, 2021 | 381 comments
I'm getting on (I'm nearly 50) - not a software dev (thank god) but more a project manager. I do a lot of the "knitting together" type work between developers, UX people, designers, content owners, etc.

Until recently, we used to do things like write cheatsheets and other help docs for our clients for tools like Google Analytics. This was all fine, and they were appreciated, as clients just don't know how to use these tools.

But recently, the rate of change has just made this untenable. I'd log into a tool like GA and the whole thing would be different. Not just the upgrade to 4, but then incremental changes there, too. So cheatsheets, training workshops, anything around support - just becomes untenable.

Another example: I log into Teamwork (my project management tool of choice) - and they're "retiring" the plan I've been on (and very happy with) for years. Instead I have to choose "Growth" and now my dashboard is littered with a whole bunch of stuff I neither want nor need. Nothing is where I'm used to it being.

And: we do a bunch of work with WordPress. The rate of change here is insane, too - every single update brings new features, none of which is documented, bedded in or understood. None of which can be written about, supported or workshopped.

And: Trello. It was fine. And then Atlassian bought it and it became this horrific behemoth of "features", all of which just clutter everything up, none of which seems to actually do anything useful.

And on, and on.

Is this rate of change supportable? Am I just too old? Help me put this in context, HN!


I’m your age and of a similar profession, with a known history of doing software development. Having been a nerd since a very early age, I have come to the conclusion that:

0) I am growing older and I may be getting more conservative.

1) I used a lot of products that evolved to a local optimum, but I see a lot of them being thrown back into an evolutionary state they already passed a decade ago. Maybe to evolve better, but I have my doubts.

2) Everything is bloating now. Instead of a collection of good tools interacting, I now have 3+ ways of opening an Excel file someone shared in Teams. All of them are broken, and Teams is broken, too.

I feel as if the excellent wrenches I have been using for 20+ years are growing tumors in the form of a can opener. All to please people who never used a wrench or a can opener before. And production of the original wrenches is cancelled. Overnight.

Next time I need a wrench it may be made of felt, because fedoras are en vogue and the mad hatter has to sink venture capital into expanding his business.

(Fun fact: in German you could make up the very valid and understandable, but still strange, word “dosenöffnerförmige Tumore” - “can-opener-shaped tumors”.)


>And production of the original wrenches is cancelled. Over night.

Even worse, the company has broken into your shop overnight and replaced the wrenches with these pieces of crap.


One good thing about the old (?) model of installing software on your PC rather than subscribing to software was that you could just keep using the version you liked as long as you liked. You might eventually face an upgrade that offered a bug fix or a new feature you really wanted, but the hard choice was in your hands.

If you wanted to move slowly (or not at all), at least you had the option.

There are still people writing books on WordStar. People are still running small businesses on 1990s-era Peachtree Accounting on DOS. Maybe they'd benefit from new software, but honestly probably not all that much from their viewpoint.


Yeah, this is one of the reasons I use emacs and have completely ignored atom, vscode, etc - I’m pretty sure that my customizations and most of what I learn now will still be applicable in 10 years.


While the emacs package landscape is also changing at a rapid pace, this is also one reason why I'm getting more and more into vim, emacs and Linux command line tools: they have been around for a long time and will be around for much longer, most likely only getting incremental changes.

I guess this is another great feature of community-maintained open source tools vs commercial software: A slower rate of change because nobody has to prove anything to anyone by adding tons of features and redesigning the UI.


That's a reason not to use a closed-source IDE like JetBrains, but since VSCode is open source, as is Atom, it's a bit less applicable.


If you don't want to, you don't have to upgrade your JetBrains IDEs, so that seems more in line with the old "upgrade when you want to" mode. Except for licensing, of course.

Though specifically with JetBrains, I find that their upgrades almost always either make things better or don't get in my way.


Eh I really don't want to have to fight against the happy path on VSCode, even if I could maintain my own fork, or whatever, and try to hold it in stasis. Emacs core is pretty fully baked at this point, and it just works on basically everything I throw at it.


Yeah, but as soon as you work with other people who use newer versions of the software, either you need to get them to export in an old format each time, which they will find annoying, or you need to update the software just so you can open the files that they send you.


>Even worse, the company has broken into your shop overnight and replaced the wrenches with these pieces of crap.

Don't forget that your brand new (felt) wrenches are imperial[1], while your old ones (and all the nuts and bolts you ever dealt with) were metric, due to an error in the shipping database.

[1] not that it makes much difference, since they are made of felt


The fundamental truth of software is that it is much easier to write than it is to read.

So it's often better, from the developer's perspective, to reimplement a solution than to maintain and refactor an existing legacy solution, even if your re-implementation ends up being inferior (which it almost always will be at the beginning).

So the reason why software cycles is that pioneers roll out some new tech, it gathers momentum, over time it becomes complex, it becomes harder for new developers to understand it, until a breaking point is reached and the new guys splinter off to make their own version from scratch, repeating the cycle.

Yes, this is extremely wasteful, and it's hell on end-users, but there is forward progress, even if it's of a two-steps forward one-step back variety.

Frankly, I don't see this situation changing as long as developers are sought after, which they pretty much always will be. If decision making is handed over to the sales and marketing folks, they will make even worse mistakes.

It would be nice if there was a "stable software" movement similar to a "free software" movement. One that championed backwards compatibility and preservation of APIs and the user experience above all. Then customers would at least have a choice as to whether to get a dynamic or stable library/application/service.

Apropos: https://www.joelonsoftware.com/2000/04/06/things-you-should-...


You don't get promoted for spending a lot of time deeply understanding what was there already, and only making minor changes!


Exactly. So many times making your own framework/language/library is how you get promoted.


My first instinct is always to imagine that writing something new will be easier than reading what's already there. The instinct is often wrong, though. Plus, I'm much more likely to learn a new trick by reading than by writing.


On (1): one thing I've noticed happening a lot these days is that PMs seem to want to add further improvements to something (to beef up their resume?), but that thing was already working great and the activity ends up making it worse.

Reddit, Google chat products, the GitHub UX rewrite for the first few weeks, YouTube.

Product schools need to instill this idea that some things _may_ already be perfect the way they are.


> Product schools

This isn't really a thing. Oh, there are a few programs out there and there are some great private workshops, but there isn't a degree you can get. There are both design schools and business/management entrepreneurship programs that cover some aspects of this. But most PMs get that job from some other angle.

And even if they did cover this ... it directly counteracts their job responsibility. Yes, you should move me to another project or fire me; this project should be considered dead. How many people are going to do that? Within certain companies and cultures with many projects and growth, one probably could, because you have a clear path forward. Or you see the writing on the wall and jump to another, higher-growth company.

The real problem isn't on the PM side, it is on the business leader / GM / CEO side of things. But even then, there is some other stakeholder asking them to expand their TAM and grow in some other direction that dictates some other feature coverage. This is a really hard trap to avoid.


> but there isn't degree you can get

In Europe you can get a bachelor of arts in product management: https://code.berlin/en/study/product-management/


I think this is a symptom of PMs (and devs) being hyper-focused on the pain points of their customers. They launched The Product a year ago, but ever since then users have complained about Problem X. The fact that The Product is amazing and creating real human value is kind of irrelevant, both to the company and to the users who are suffering Problem X. Those users are raising hell. Meanwhile, tech corporate culture holds as a basic tenet that if your product is done, it's dead. But solving Problem X requires re-thinking the entire UI, so off we go.


Most of GitHub's changes as of late have been improvements imo.


Because they weren't done. Not all change is bad. But for a "mature product," one needs to carefully consider the cost of bloat. Modular/plugin type systems (done well) can be one way to avoid this, since they allow users to select their own feature bundle. But you really need to manage the abstractions and interfaces carefully to not add more friction or user confusion.


The first few weeks after the redesign had a lot of bad things that were fixed because people started raising issues.

It didn't show the last commit message, for example.


If you don't keep developers working on a product you'll find a few years later that nobody has a clue how to support it and then the rewrites start.


if you don't change something you are not "innovating". this industry runs on "innovation"


When they add another crap // To your favorite app // That's tumoreeee

(Sing to the tune of https://m.youtube.com/watch?v=OnFlx2Lnr9Q)


*When they add one more crap // Onto your favorite app // That's tumoreeee


Teams is broken


Oh so very broken. I wanted to do a quick meeting with a client, and being used to doing quick five-minute video calls using Google Chat + Meet I figured Teams would be able to do the same, but nope. Well you can, if you can find people, but in the end it’s quicker to just schedule a meeting and send them the URL to join.

Teams is basically junk, made from the bits of Skype no one else wanted and things that fell off Chrome.


Seems like every time I launch it, my virtual backgrounds are gone. I thought maybe I wasn't logged into Teams, but no, I am. Now my only option is blur, if I can even find where they hid that!


My "favourite" Teams bug is just swallowing input in chat [1]. I mean c'mon, a networked chat app is basically a sophomore-year CS assignment, and not a demanding one at that. Swallowing/dropping input would be enough to fail the grade. But somehow, one of the richest companies in the world, owned by a business mogul who's also a symbol of the tech industry, cannot get it right. That's mind-boggling.

[1] Sometimes, when I open a new chat and quickly start typing, after 2-3 seconds the whole window suddenly redraws and whatever I've managed to type in so far is no longer there...
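
If I had to guess at the mechanism (pure speculation on my part), it's the classic bug class where an async load resolves late and clobbers controlled input state. A toy sketch of the pattern, with a made-up fetchChat helper:

    // Toy sketch of the bug class (speculative, not Teams' actual code):
    // the chat load resolves *after* the user has started typing, and its
    // state update wipes the draft.
    import React, { useEffect, useState } from 'react';

    declare function fetchChat(id: string): Promise<{ savedDraft?: string }>;

    function ChatBox({ chatId }: { chatId: string }) {
      const [draft, setDraft] = useState('');

      useEffect(() => {
        fetchChat(chatId).then(chat => {
          // lands 2-3 seconds later and clobbers whatever was typed meanwhile
          setDraft(chat.savedDraft ?? '');
        });
      }, [chatId]);

      return <textarea value={draft} onChange={e => setDraft(e.target.value)} />;
    }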


Even the most basic flows of Teams are broken.

If I open a call link in my browser it shows an option to open the desktop app.

Most of the time the Teams desktop app (Electron) opens with a blank screen and starts to restart itself. After several iterations it will log out completely and ask me to log in again. If the login is expired, it should ask already at the first try.

I grew so tired of it that unless I need to share my screen I just use Teams via the browser.


Broken implies that at some point it was working.


Teams is broken, Microsoft is breaking everything

No Microsoft, I don’t want to open the PPT in the browser or Teams, where it’s barely functional, graphs cannot be edited well, and everything is displayed differently than in the app.

No Microsoft, I don’t want to save to your proprietary cloud; I have never done so, so why must I click through three panes to get to the original file browser in order to save it elsewhere?!

No Microsoft, this is my work computer; I don’t want news articles and web searches popping up in the task bar and notification area.


Having been forced to use Teams by my employer since its inception I'm going to say that it has never worked properly since first release.


I always run the Taskkill /IM Teams.exe /F command on my machine when I switch organizations in Teams and back. Without killing the Teams task I will not get any updates.


It is literally a hack, born out of a hackathon.


Teams has been a phenomenally successful product. It might not work great in all dimensions but overall it delivers in so many ways.

To talk of it as broken implies it is there to do one thing (which it doesn’t); however, its role is multifaceted and in totality it delivers.


> Teams has been a phenomenally successful product

I'd bet to a large extent because it's "free" if you are already in the O365 ecosystem.


This comment is gold. Wrench tumors!


Bloatware is devouring the world!


Truer words have not been spoken.


I think there are at least 3 things at play here:

#1) Software used to live on a disk you buy. That software doesn't change until you buy a new disk. Then it lived in apps, where you can update or not. Now it often lives in the mythical cloud, where changes can happen all the time.

Now, even on the disk front, things were always changing. Often for the better, sometimes for the worse. In the OS world DOS->Windows->Windows 95 were big changes, and the OS9->OSX change was also huge! But now the changes are constant.

#2) The entire software world is built on VC money. VC money is not looking for slow and sustainable growth. Or a happy userbase of 10k people. The VC world doesn't mind if 99 companies crash and burn trying to harvest the wind while building their sails if one takes off.

#3) Outside the VC/startup world, large companies must justify their existence, and every team and every programmer on that team must justify theirs. No one ever stuck around by saying "everything is good, we literally don't need to do anything or acquire another customer, let's all cut our hours to 2 days a week, keep patching bugs and making security updates, be happy with our current level of subscriptions, and spend a bit of time in R&D to make sure we have other bait in the water too if for some reason our users stop liking this."


> No one ever stuck around by saying "everything is good, [...] spend a bit of time in R&D to make sure we have other bait"

That's more or less what FogCreek did with Trello and CoPilot. While FogBugz was "good enough" to more or less run on rails, they made other bets to expand or productize some of their internal toolkit. It worked so well that these side-bets eventually got much bigger than FogBugz.

I wish more people tried that, instead of messing with their cash-cows. Yes, sometimes you have to keep up with competition or "innovate" in your own space, but if you got big already, chances are that you've done your best work already - leave it as it is and move on.


This has long confused me about energy companies. You're big in coal, or gas, or whatever, and see green energy as a competitive threat. Why not take some of those massive profits and treat it as lengthy and luxurious runway to leapfrog into those new markets instead? There's a defeatism there that I wouldn't have anticipated based on how confident and aggressive those companies have historically been.


It is really a shame that there is not a carbon dioxide tax that pays out for sequestering carbon dioxide (negative emissions). Sequestering CO2 in geologic formations would fit right in with oil and gas companies' core competencies. Many of those companies supported a carbon dioxide tax, and they could have transitioned to companies just sequestering carbon dioxide sometime in the future. This could still be done right now, but current politics is more interested in punishing the other side than finding win/win solutions.


You say 'this could still be done right now', but in many cases it is being done already. For example the company I work for [0] spent many years databasing where oil reservoirs were globally. Now we sell the same information for sequestering CO2 in the empty reservoirs :-)

[0] https://www.cgg.com/industry-applications/energy-transition/...


That's cool. Looks like quite a big international operation. Who are some of your customers and places where they are sequestering CO2? I had not heard it being done at commercial scale yet and don't see any examples on the website.


Some are, but it may not be obvious because instead of doing this directly within their company they are becoming venture capital funds for different energy markets.

For example Shell Ventures [0] funds companies like d.light (solar home energy) and Husk (off grid renewable energy)

[0] https://www.shell.com/energy-and-innovation/new-energies/she...


That's basically Shell's strategy.

So now you're Exxon, and you see that Shell has significant investments into oil AND Green energy (specifically Wind right now for Shell). How do you plan to beat Shell at their dual-strategy game?

Exxon doubled down on oil. You can't just do the same strategy as someone else and expect to get ahead. You gotta place your bets on a different strategy.


> You can't just do the same strategy as someone else and expect to get ahead. You gotta place your bets on a different strategy.

No, you don't. Betting all on black is just a different gamble, and in my opinion a more risky one. But investors are ok with it because many prefer pure play gambles.

There's no law of nature saying that you have to zig when others zag. A big opportunity like clean energy has room for many winners.


It's a bit different in that space because the "new markets" are seen as a threat that will cannibalize the core business. It becomes a major political issue within the organization.

Not to mention the reputation of the organization might make it difficult to attract and retain the types of innovators and professionals it needs to succeed.

Whereas with FogBugz, they were investing in complementary products, not destroying their own market.


There's a popular book called The Innovator's Dilemma that assesses the phenomenon of incumbents being out-competed by upstarts.

https://en.wikipedia.org/wiki/The_Innovator's_Dilemma


Every energy company that I know of makes substantial R&D investments in an effort to patent the most likely technology paths forward. They all hope to develop a portfolio of patents that will allow them to sue any new competitor that becomes a threat.


Solo devs who want to make a few bucks should look into developing desktop applications. VC companies can't compete in that space, it's not profitable enough.

Only problem is we don't have any good UI toolkits anymore, all are second rate compared to the browser. Qt might be the best, but their licensing scares solo devs.


I want to make a living making desktop applications. However, as far as I can tell, unless you hit the lottery, it's not doable in general, because of a growing contingent of users who won't use your tool unless it's free or open-source (there's a bunch of people running around the internet, many on HN, who think that the existence of proprietary software is intrinsically evil), and an increasing number of devs who will make an open-source clone of your tool simply because yours isn't open-source.

With that, if the only thing that's going to happen is my ideas get taken and used in an open-source project, why not just keep my tools to myself and continue to work for a big corporation?


Three years ago I started a proof-of-concept Electron app to design and print labels on thermal printers. That app has grown into mid-6-figures revenue from mostly “one-time license” sales. I do offer monthly subscription pricing with an expiring license, and after a year my MRR has hit $3k. My competition is Windows-only and closed-source, largely backed by private equity.

I live in a small town in Minnesota so I don’t need to make much money to be comfortable, raising a family. I’m in my late 30s and have done mostly iOS and Node IIOT consulting the last 10 years. That’s what was paying the bills until my electron app started making “real” money.

On the technical side: I get emails every day saying how much a user loves my app. It’s really rewarding. Every year I get one angry email or App Store review saying that there are much better design/desktop-publishing apps out there. These reviews are usually written by technically savvy people who don’t quite get how stupid-complicated label printing can get. Like, don’t expect to design your company logo in my app. But surely, you can drag in your company logo and my app will dither your image for a 200 DPI Zebra printer.

My app is basically a no-code image processing pipeline that effortlessly hooks up spreadsheet data, generates barcodes, resizes and fences text, dithers images, dynamically changes colors, and then talks directly to a thermal printer without using a driver.
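
The dithering step is in the spirit of classic error diffusion (Floyd-Steinberg and friends). A minimal sketch of the general idea, not my actual code:

    // Spread each pixel's quantization error onto unvisited neighbors so a
    // 1-bit thermal printer can fake grayscale. Mutates `gray` in place.
    function ditherToOneBit(gray: Float32Array, width: number, height: number): Uint8Array {
      const dots = new Uint8Array(width * height); // 1 = burn a dot
      for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
          const i = y * width + x;
          const quantized = gray[i] < 128 ? 0 : 255;
          dots[i] = quantized === 0 ? 1 : 0;
          const err = gray[i] - quantized;
          if (x + 1 < width) gray[i + 1] += (err * 7) / 16;
          if (y + 1 < height) {
            if (x > 0) gray[i + width - 1] += (err * 3) / 16;
            gray[i + width] += (err * 5) / 16;
            if (x + 1 < width) gray[i + width + 1] += (err * 1) / 16;
          }
        }
      }
      return dots;
    }

The real pipeline layers resolution scaling and per-printer quirks on top, but that's the core trick.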

And you can use it for free for 14 days to see if it works. It’s a single download. Yeah, it’s Electron, yeah it uses some memory. But you can share the design files between Mac and Windows and it doesn’t require internet for use.

I don’t know. I think it’s a pretty good solution.

Read more at https://label.live


Very interesting! I'm always amazed at the kind of creative work I come across on here. Did not expect to read a success story about generating imagery for thermal printers this morning, but I'm glad I did.

Btw, what do you mean "without a driver"? Do you bundle code that talks to thermal printers in your app? Is there a standardized communication protocol in this space or do you maintain a breadth of custom implementations?


Also worth noting that in my journey I’ve added the ability to submit the label jobs as PDFs to existing printer drivers. This allows for targeting full-color label printers like Epson and Primera, but also inkjet and laser printers that use sheets of labels. This is accomplished by rendering the labels on a larger page and then sending the resulting PDF to the system printer. There’s also the option of sending the images to a directory on the file system and naming the files using a column from the spreadsheet. Quite versatile … but difficult to market.


Yes, the app uses node-usb to read and write directly with printers matching vids/pids. Each vendor is a custom implementation with some similarity and lots of differences. It usually involves parsing status to get DPI, busy states, then sending raster image data, and other commands to place the image, feed speed, heat settings, cutter, etc.
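
Roughly, the shape of it with node-usb is something like this - a bare sketch, where the VID/PID and the command bytes are placeholders (the real app keeps a per-vendor table of both):

    import { findByIds } from 'usb';

    const device = findByIds(0x0a5f, 0x0001); // hypothetical Zebra-class printer
    if (!device) throw new Error('printer not plugged in');

    device.open();
    const iface = device.interfaces[0];
    iface.claim();

    // The bulk OUT endpoint carries status queries, settings and raster data.
    const out = iface.endpoints.find(e => e.direction === 'out') as any; // OutEndpoint
    out.transfer(Buffer.from('~HS\n'), (err?: Error) => { // e.g. a ZPL-style status query
      if (err) console.error('write failed', err);
    });

Status parsing is the fiddly part; every vendor speaks its own dialect.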


>it's not doable in general, because of a growing contingent of users who won't use your tool unless it's free or open-source

Oh, don't worry about that. Trading money for things is an ages-old thing and will remain for centuries no matter what. Just make something useful and put the right price on it. People will buy it if it solves their problem. If someone won't use it because it's not free - they don't really have the need that your product solves. Just ignore them.


This is not as bad on the Mac platform as it is on Windows.

Anyway, if a few devs can quickly slap together a functional open-source clone of your app, you might want to ask yourself if it really provides enough worth to your users to pay for it.


I've been making desktop apps for two decades (proprietary closed source software).

My target audience is businesses, not HN users, so I haven't run into these open source zealots.

Some businesses do ask for open source (politely declined)

Some have also asked to place the source code into escrow for continuity in case the business folds (also politely declined, but for some companies this is a reasonable ask)

The thing is that people will ask for literally anything and the requests are asinine:

* Rewrite in java\python\ruby\c++\whatever language the IT guy read about last week (nah)

* We need a mac version for the one guy who somehow uses a mac (nope)

* Support oracle (fuck no)

* Support Novell networking (lol)

* Make a webapp (lmao)

I'm also not afraid of the "hacker" who is going to re-write my app. If theirs is truly better, I will just fork it and crush them :)

Don't let the bobble heads dissuade you. If you want to make a desktop app and if you think the desktop is the correct medium, do it.


Username checks out.

Loved VB6. I have a friend who is still writing and maintaining software he wrote in VB6 for an industrial plant to this day.


Curious, how would this solo dev convince anyone to _download and install_ something on their machine?


The fact that you are even asking this makes me feel so old :-)

Anyway, you'd be surprised. Once you venture outside the HN echo chamber there are large numbers of people who really prefer that their data stay on their own PC. Also, many are concerned about the same problem that this topic is about: having interfaces change without warning is a huge problem for most people.

Convincing them is largely about providing an upgrade path if they need it and not having forced upgrades.


> Once you venture outside the HN echo chamber there are large numbers of people who really prefer that their data stay on their own PC.

AFAICT, that's backward: people with that preference are over-, not under-, represented on HN.


No, I think parent is right. One gets the impression that HN only meets two kinds of computer user: their parents who just use a web browser[0] and tech company or large enterprise employees. There are a lot of people out there relying on highly specialized VB 6 applications to run their business or engage with their entirely-too-serious hobbies.

[0] anyone who just uses a web browser arguably doesn't even count as a 'desktop computer user'.


You're right. I really didn't say that well. What I was trying to get at is that there's a preference for webapps vs. PC apps on HN. Which is strangely at odds with the privacy obsession you find here.


Not sure why you got the impression that I haven't used software in the 2000s.

My understanding right now is that if you have a software that people need to install and your competitors have an in browser offering, your customers overwhelmingly choose the in browser offering.

This has been my experience even though I hate the subscription model as much as everybody else in this thread


Indie game studios have managed it. They're a response to the current era of bloated AAA titles that discourage single-player modes and nickel and dime the user with frequent microtransactions to keep the recurring revenue coming.

Speaking for myself, I have replaced Adobe CC with Affinity Publisher, Designer and Photo. A one-time fee for software I can use forever, as opposed to paying forever regardless of whether I use it or not.

I also have Dreamweaver and Flash-type animation tools (Bootstrap Studio and Tumult Hype) that are also one-time fee apps.


You aren't the only one. A lot of people and smaller shops have abandoned Adobe for this reason. A single, upfront cost is just a more justifiable expense than an upfront cost with a monthly subscription, not to mention losing access to your work if you miss a payment.


Market it as "pay once, use forever" instead of the infinite cost sink of a subscription model.


Change the model to "pay the subscription fee otherwise we will alter it"


And pray we don't alter [the deal] any further


And more and more, "it will never change, no new features" can be a selling point.


Sure, unfortunately as a solo developer unless your app is a unicorn or you make several of these apps, pay once use forever doesn't pay your bills a year from now.


> or you make several of these apps

Yes, this.

Investing all your energy into features none of your users want for an existing app is a waste of energy for you and an annoyance for the users.

And maintaining the backend of a SaaS can mean more operations busywork for you, compared to a standalone application.

Build something new people actually want.


I'm afraid desktop apps mean a lot of annoying bug/crash reports from users with idiosyncratic local configs (e.g. antivirus software, or they modified some Windows registry key a long time ago because a YouTube video told them to do so, and now don't even remember it, etc.).


They would publish it in a trusted App Store.


Do desktop users use these app stores? I've yet to install a single app from the Mac App Store.


My anecdotal evidence says yes. My wife was confused on how to "install an app" for something that wasn't in the Mac app store. She had been used to the iPhone/iPad model for so long that she had forgotten how it was on her Windows PC 10 years ago: if not in app store -> google company/go to company website, download the installer, install program.

I feel like general computer skills have decayed quite a bit in the general public, especially with people who don't work with computers 8 hours/day. See this recent article: https://www.theverge.com/22684730/students-file-folder-direc... The advent of the smartphone/tablet overtaking PC use for general/casual use, e.g. emails, photos, social media, online banking/billpay has really reduced the general public's skill level with desktop operating systems. Probably similar overlap with the subset of people who always dumped every file to the desktop directory instead of organizing files in specific directories.


Give them a choice between a web app, a desktop app (which can be on Steam or the Windows Store too), or mobile apps.


Do you not do this?


No, unless you're counting the browser, IDE and everything CLI I install using brew.


Interesting. I have the Adobe Suite, Ableton, a zillion audio plug-ins, ProTools, and the Unreal editor, not to mention Dropbox, Outlook, Teams, Slack, Bear, Excel … This is just from memory without looking at my apps folder.

Even for IDE I have XCode, Webstorm, Sublime …

In general, native apps are still far superior for most tasks I do.


What if this something did not need installation? Just download and run.


If it fulfills a business need it's a lot easier.


Unless you’re doing Linux applications you have all the toolkits you need. WinUI for Windows and Cocoa for the Mac, done. They’re not cross platform, they might not be all they could be, but they are the default and they work.

I believe you’re right in that there is money to be made in developing desktop apps, but I also believe that the customer base expects things to be native, so don’t try to do Windows applications using GTK or macOS applications using wxWidgets.


>They’re not cross platform

Which is why everyone and their dog is going Electron. Linux compatibility is one thing (which no one really cares about, except for software in the public services sector), but users generally expect that their software is available on a Mac - not to mention that people using a Mac tend to be on the better-earning/willing to pay actual money for software side.


I disagree. I think the troubles of cross-platform GUI are overstated. People used to port software to a dozen different 8-bit micros and 5 game consoles using assembly; writing for Win32 and Cocoa can't be as difficult as that. Besides, it will incentivise you to make your interfaces simpler, while still providing you the ability to leverage platform-specific features.

I think everyone is moving to Electron because web developers are a dime a dozen.


True, but beyond VS Code, name one Electron application that’s actually good.

Companies want their application on two platforms, use Electron, and now their product sucks on both.

Electron is actually a reasonable solution for internal and bespoke software. It’s just not at a level where I can see it being something I’d want to pay for as a consumer. Again, VS Code is pretty good, but it’s still clearly not a Mac app.


Discord and OBS come to mind.


Discord certainly works, but when I see a chat box lag on an i7 8700k with 32GB of memory I die a little inside.

And that happens occasionally with Discord, without running make -j42.


Discord is awful to use as a desktop application. It's sluggish and cumbersome, not snappy and lightweight like a native application would be.


I'm fairly sure that OBS is written in C++ using Qt.


My dog is actually doing Mac only, since that’s where the money is.


I’m really tempted to learn Swift and the macOS UI toolkits just because I’m 100% on macOS these days and there’s a few things in my life that I think could be better as a stand-alone desktop app rather than a web app or Electron app.


SwiftUI is really nice but I've found performance to be questionable (they seem to be improving it over time) and there are some things that are impossible to do in it that are absolute must-haves for some application types. For example, there is no way to change the tab order of which text entries get focus.


unrelated, but nice username!


Cheers, lots of things from San Francisco get featured on HN but the Grateful Dead have to be my favourite!


> all are second rate compared to the browser

As a non-web developer I'm genuinely curious, what is a good UI toolkit targeting web browsers/Electron that rivals Qt?


With my next business I’m aiming to do the opposite of number two: I want to address issues in a very old, traditional market and find a few thousand customers delighted with a reliable, slow-moving product that solves a big issue for them, and if the product were to ever get “feature complete” that would be great afaic.


If I ever get around to making a SaaS or whatever, all new features will have a toggle with a prompt letting existing users choose if they want to enable them or not, including interface changes. (And new users can disable features they don't want, to keep things simple. KISS seems to be completely ignored in big tech.) If I can't program cross-compatibility (keep things modular), then I suck as a programmer. If WordPress can have themes, extensions, or whatever, so can I.
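
Roughly what I have in mind, as a sketch (all names made up):

    interface Feature {
      id: string;
      description: string;
      defaultForNewUsers: boolean;
    }

    const features: Feature[] = [
      { id: 'new-dashboard', description: 'Redesigned dashboard', defaultForNewUsers: true },
    ];

    // An explicit user choice always wins; otherwise only new users get the
    // new behavior, so nobody is surprised by a change they didn't opt into.
    function isEnabled(f: Feature, userIsNew: boolean, overrides: Map<string, boolean>): boolean {
      return overrides.get(f.id) ?? (userIsNew && f.defaultForNewUsers);
    }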


Sounds good, but impractical from many standpoints, the least of which is the programming aspect. Options for users are the opposite of KISS (but very nice for them).


If the SaaS is complex, this becomes impractical if toggles for "all new features" are the goal. Maybe a subset is doable.

If it's a simple SaaS, by all means go for it. I really like the idea of being able to toggle things I like/don't like. But most of the time I become a "second-rate" citizen in the eyes of devs when I'm on a lesser-used subset of features.

An example is this service I'm using to manage my small company's finances; because I refuse to toggle on a feature that makes life easier for most people (but not me), getting support for bugs has been very difficult.


Good luck hiring enough people to maintain a billionty different versions of your service


Websites don't run multiple versions of Wikipedia, do they? Everyone would be on the same version; they'll just see different things.


If there are 30 feature toggles, there would be 2^30 ≈ 1 billion possible combinations.

Finding the combinations that result in something broken (through some unexpected interaction of features) would be impossible via QA. I guess you could keep a list of the more popular combination of settings and test those. More likely it means you'll let the customers do the QA for you. The customers will not like that.
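
To put numbers on it (a trivial sanity check):

    // Each independent boolean toggle doubles the state space.
    const flags = 30;
    const combos = 2 ** flags;
    console.log(combos.toLocaleString()); // "1,073,741,824"
    // Even at one test per second, exhaustive QA needs ~34 years:
    console.log(combos / (60 * 60 * 24 * 365)); // ≈ 34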


Not sure if you've ever worked on complex systems before, but features simply don't exist in isolation. Features interact - sometimes in complex ways, sometimes in unexpected ways. The cost of supporting user-facing toggles for every feature in your product grows exponentially.

If you've ever maintained a system that uses developer feature flagging, the cost is large and typically this is for feature flags that should only exist for a matter of weeks or less.


This is the versioning for TeX: the version number is the value of pi, with another significant digit added for each new release.


And when the author passes away, the final version will become exactly π, and all the remaining bugs (if there are any) will become features.


This sounds delightful. Best of luck with this!!


Well come on, what business is it (if you're happy to share at this point)? Some people will want to prioritize it just because of this perspective.


It’ll be in the manufacturing space, building on an app I wrote for my own factory a few years ago.


I think it's also the cadence. You alluded to it in #1. I used to work for a traditional software firm years ago. We actually cut CDs and shipped them. We would do major releases twice a year and patch releases twice a year, yielding 4 releases a year. That was considered a lot. Now it's releases every 2 weeks or so.

The nice thing about installing your own software is you controlled the updates. If there was no compelling reason to upgrade and you were stable, you didn't. Most people today could probably break out a copy of Office 97 and be totally fine with it.


#1 is important since you can't generally opt out of the change but I think it is largely #3 which speaks to the rate of the change. Everyone has to justify their existence. Everyone has to justify their bonus. The C suite has to present something to the board each quarter. Making great money is not enough, we need infinitely more. Lots of times the revenues go up but so do the costs and the profit goes down. Then there is even more pressure to add more crap to get the profits back up.


> 2) The entire software world is built on VC money.

I don't think that is the case (depending of course on how we define "the entire software world"). I've been a developer for 20 years and never worked for a VC-funded company.

There is a tonne of software development outside the realm of VCs.


#4) Attention is the new money.

For a lot of software, like social networks, getting new users and new downloads is the main goal.

When a company is not charging [most] users money, the second-best thing is publicity or grabbing and selling users' data. Both options require user churn. Keeping users happy for a long time is not their concern.

Unfortunately this is happening in the FOSS world as well. Constant churn of software products, features and bugfixes to keep users addicted.


> Unfortunately this is happening in the FOSS world as well. Constant churn of software products, features and bugfixes to keep users addicted.

I suspect there's so much churn in FOSS, because developers love rewrites and starting from scratch. In the world of commercial software, written by companies, the bosses/owners reign them in. FOSS, on the other hand, does not have bosses or owners...


No, most churn comes from company-driven FOSS, especially around devops stuff.

Among other reasons, they use churn as a way to control the userbase and prevent successful forks.

> FOSS, on the other hand, does not have bosses or owners...

Corporate-driven OSS has plenty.


An extremely exhausting world, with no end in sight. I just wish all the money would run out or something, to give (most of us) a break.


Maybe we need a name for the opposite of software that is exposed to such a pressure to change: Stableware.


#4) The dream of reusable components is here. Every major dev system from Emacs through Rust has a vast library of component dependencies. These things are a swirling unstable mass, leading to spacetime fluctuations up at the human level such as supply chain attacks, mandatory upgrade cycles, and orphaned deps. Then the complexity leads to wrapping standard deps to standardize them, leading to multiple standards...

You know it's bad if there are multiple XKCDs about it.

https://xkcd.com/2347

https://xkcd.com/1987

https://xkcd.com/927/


As a Python developer who bought the wrong USB-to-USB-something (whatever the iPhone SE uses) cable just last week, the second and third links you provided hit very close to home. I don't know anyone from Nebraska just yet, though.


#4) We aren't innovating in the same way that we used to, so many of us are doing rather pointless tasks to justify our existence.


I secretly think every major change in every software is somebody's L6 promotion


This is a bizarre analysis given that one tool being talked about here is produced by Microsoft, which has nothing to do with VC money.


Yes, the focus on features, and ongoing lack of bug fixing, is maddening. This applies to big companies as well as small.


VC = Venture Cancer?


IMHO the answer is the tech version of bullshit jobs.

You know the concept - jobs that exist just to exist and pay for someone's living, though they produce nothing of actual value, and it takes effort to actually remove them so no one does.

With software it goes something like this:

1) A product is built, something that solves a real issue. It's new, rough around the edges and has only the basic critical features.

2) Funding is found and a large team is hired to implement what's needed and fix all the rough edges.

3) The team comes up with good, new ideas that actually improve the product experience for everyone, so in the eyes of management and the investors they have merit.

4) After the first 10 good ideas are implemented, the next ideas are... not as helpful, but some users still like them and nobody wants to fire anyone who's been doing really good work so far.

5) Fast forward a couple of years: the team is huge and there is really no headroom to improve the product. Everyone tries to come up with bullshit ideas and force them through, just to have something to show come performance review time. The users stay with their tried and true product, and they're so invested at this point that a few regressions make them bitter - but not enough to leave.

6) Rinse and repeat.

I've been on that team. The engineers there are often mentally older, comfortable, and smart - or at least were smart. Many of them are sure what they do is important. But they're wrong. They now hold bullshit jobs.


I agree, this is indeed very wise.

There's a second sort of form of this: even if your (first) goal isn't to make changes, but just to maintain what you've got (operations, reactions to security issues, keeping up with the new Android/iOS/Windows/Web 99.0 way of doing things), you might need a team of a few people.

But they won't need to be doing that 100% of the time. The work will come in spurts. You'll find something to do with the time in between, because you can't hire people just for the spurts. Thus sometimes-BS jobs.


I think this is so true. At a certain point the software is done. And the only way to "improve" the software is to just build adjacent software to enter horizontal or vertical markets.


I'm in my 20s, but I definitely think so.

Back in the day redesigns were well thought out and kept as many of the previous design decisions as possible. Ever since things started moving to the web, this seems to have changed.

Layouts, methodologies, subscription model, payment options, login options, core application design, it can all change from one month to the next. Everything is constantly being A/B tested, there is no clear release cycle anymore.

Now that iOS and Android finally seem to have settled down on a design, Windows changes their UI style again. Web frameworks seem to stagnate (finally) but there is already a slow move back to integrated server side rendering frameworks.

Modern software development is based on "move fast, break as many things as possible, get promoted or bought out by FAANG". The time of cheat sheets, manuals and curated workflows is over; everything is now SaaS/PaaS/IaaS/AaaS and the only way to use a computer is to constantly relearn your workflow. Reading about new features and upgrade documents is no longer optional, because the next update could substitute some of the old features you rely on with new ones.

Your computer, tablet, phone or TV could update tomorrow and you'll need to learn the entire contacts manager or file manager or settings menu from scratch, and there's nothing you can do about it. And those are the systems that undergo relatively infrequent UI redesigns.

There are some things you can do. Stick with LTS software if you can. Stick with single-purchase, self-hosted software if you can. Avoid anything with buzzwords ending in aaS on their homepage like the plague, and try to switch to something else when your favourite self-hosted tool switches to an aaS model the way Atlassian did. You'll still end up using tons of crap that switches designs because the design team got bored again, but at least it'll affect as small a part of your flow as possible. Oh, and consider disabling automatic updates until security patches get released. I'd be the last person to advise someone to skip security updates, but redesigns often come in small parts and rolling updates, and you can delay them a little bit if you skip the unnecessary updates.


The most irritating aspect of this is "feeds". God, I hate feeds. Especially the wishy-washy clever ones.

The only feeds I still use are Twitter and Spotify. The former is cancer and its contents had better be fleeting. But why on earth can't Spotify be predictable? Its start page is essentially useless. And despite all the clever people being thrown at the problem, the Release Friday playlist is nowhere to be seen on Fridays. This can't be this hard.


Spotify's whole UX is a disaster, and it's puzzling to me why that's the case. They certainly have the resources and the high profile to attract people who can fix it. I guess they just don't feel the need.


Try using 2005-era iTunes. Dense interface, everything like a table, clear UX patterns for clicking an album and then listing the songs in that album, no suggestions or lengthening your curated playlist without asking you, no animations or fancy graphics. Ok, it used to crash a lot but that's an orthogonal problem.

Unpopular opinion: "Design" is ruining everything that was good. Most suggestions for good "UX/UI" on HackerNews are shit. So are all the articles posted here and advice given. Most UI frameworks suck. The whole thing is rotten to the core.


Yeah I've never been an Apple fan, but I used to run iTunes on windows because it was fucking great. Then they added the album view stuff and performance degraded and it became unstable.

I use Foobar now and sure Foobar takes a bit of power user level stuff to get it configured exactly how I like but it is a nice tabular music library I can search and then click on stuff to play. No bs fancy crap screwing it up


I'm not a Spotify user but the news that came out about albums being played with shuffle as default makes me think they don't care about music. Not surprised if everything else is misaligned as well.


IIRC albums _only_ play on shuffle unless you have a paid subscription. So it may have been some default bleed-thru, not a conscious decision.


Who knows what they're optimising for?

Other than the start page, Spotify is perfectly fine though.


There are other issues. The random/shuffle has never worked properly. Why can't a billion dollar company figure out that they just played the same song for me a half hour ago? It is so bad we are drifting away from Spotify after subbing for ten years.


But again, ask yourself- what are they optimizing for? Entropy, or generating revenue through pushing specific artists?


I think humans have a point where they have to commit to a choice to see what can be added upon. Back in the day everything was a long, consequential change; you can't update 1M CDs, 1M cartridges, 1M cars unless you bleed.

The instantaneous, differential nature of the high-bandwidth web gave near-full freedom. To the point that you hadn't even started seeing what you could build with a thing before it had already changed.

ps: another weird thing: all the engineered methodologies lead to incoherent changes. They're not progressing, they're jumping from one local maximum to another... Chrome and Google Apps UX regressed wildly multiple times. It's instability as a service.


I built a desktop application in Java Swing about 15 years ago. I dropped it because it was cemented in "old" tech. After trying to rewrite the app several times in various cross-platform frameworks for the web and mobile I gave up and went back to the old code.

I fired up the project after not touching it for ten years and loaded it in NetBeans. It built and ran without a single issue. I was flabbergasted. If I leave my Xamarin Forms app for a month I come back to find failing builds, broken dependencies, dependency version mismatches, and OS-related failures induced by updates. I spent countless hours hunting down solutions to these, and it happened numerous times for different reasons. Visual Studio changes, OS changes, core framework changes, core language changes, library changes....omg.

I'm working in my old crusty Java Swing project and I love it. Each minute I spend working on it moves the application closer to done. There are awesome open source themes available now and it looks great. My old code to remember window sizes and screen/monitor positions still works. Unfortunately, that was surprising as hell.

I have the expectation it will work twenty years from now. Web app or mobile? Not a chance in hell.


Java is solid as hell. That's why (well that plus the ease of hiring) it's the default choice for most major corporations.


Developers in corporations are moving away from Java at a frantic pace. A lot of it has to do with remote functions (AWS Lambda etc.) being very slow with such a heavy standard library and the runtime startup.

It's a mistake. As is the lemming-like rush toward remote functions for everything. Everywhere I go I see people hacking on the Lambda execution meta to improve initial start times--and eventually giving up--because the PM doesn't like a slight random delay. PM keeps complaining.


I like "Instability as a Service". We should be more vocal about how costly to the customers these services really are, and we should do chargebacks for these costs. If many customers expressed the concern they’d perhaps do something about it.


I think it's pointless; the whole web is a big ship coasting on its own inertia, and usability doesn't really matter much.


+1000 for Instability as a Service!


When you use installable/desktop software you can also control network security, and therefore application security, via your firewalls and other devices. So the user is able to control secure access and contain any possible breaches...cheaply...instead of just pumping all their data to the cloud and hoping the SaaS provider knows what they are doing and has secured every single exploit and leak vector that comes with publishing a web-based solution.


Good advice and thoughts, thank you - much to think about!


This is what "business at the speed of light" looks like. Everything is sped up, including tool development.


It's not just you. I was the "IT guy" for a small marketing firm, with about 40 people to support. We were a Microsoft Windows shop. Once I got everyone settled on Office 2000, we didn't upgrade. We skipped Office 2003, 2007, 2010, right up until they outsourced my job in 2013. The users LOVED ME because I didn't change shit on them all the time. My workload kept going down until there was really nothing for me to do, because I didn't keep changing things.

Having stuff in the cloud, as a service, means you have no control over change. That is an unsustainable model for the end users. It'll take a while for the pendulum to shift back. Right now they can use the fact that you can't secure a Windows machine (ever) as a wedge to keep things managed... eventually this will be solved. (Capability Based Security) At that point people can run their own stuff again, and p*ss on the cloud.

You can still support it, you might even be able to make the case for bringing the stuff back under local control. It's just going to be a lot more work.

It's not your age; it's your intolerance of bullshit that is at work here.

Good luck!


> Having stuff in the cloud as a service means you have no control over change

I think you might be onto something


So, no security updates? Your users didn’t complain they couldn’t open documents created in newer versions?


There was a patch released in 2007 that allowed access to the new formats from the older software... thus the users didn't have to learn a new UI, and could still deal with new incoming files just fine. We really didn't have any trouble exchanging files with the rest of the world.

As for Windows updates, we did those.


The Office compatibility pack allows opening the x filetypes in older versions, but it does not preserve document fidelity. New features still aren’t supported in the old application. So that doesn’t really work if you exchange files with a lot of people.

Windows updates don’t solve security issues in Office applications. If you were 10 years behind on security updates and you never got hacked you were either an uninteresting target or just lucky. Either way I wouldn’t be too happy with this kind of IT.

Then again even if you hadn’t been replaced your policy couldn’t have lasted anyway because the last operating system that supported Office 2000 was going the way of the dodo.


Technology, no doubt, can do amazing things. Google Maps can tell me when my next bus will arrive so I don’t have to brave the -10C weather. It’s like magic.

However, there is ONE important side effect that I have not noticed until now.

Most technologists crave speed. Faster processors, faster disk drives, faster networks, faster everything. Bottlenecks are our common enemy. YES. They are evil. I can relate to it because I spent a good amount of time in my career fixing performance problems for financial systems.

No one likes to wait for the computer to respond.

Unfortunately, this craving for speed (in technology) has quietly bled into other aspects of our living. People learn to speed read to gain more knowledge faster. People speed walk regularly (yes, I can also feel it in Hong Kong’s subway stations.) And the most crazy thing is: we don’t realize it until our body cannot cope with the demands of our speedoholic minds.

I watched this Carl Honore talk from 2005 (http://www.ted.com/talks/carl_honore_praises_slowness). 10 years later, it’s hard to believe that many of us (including myself) still get caught up in thinking “Slow is bad.” But no, there is such a thing as “Good slow.”

You need to be patient in building a relationship; you need to have a clear mind in thinking strategically; and you need to be willing to spend time making mistakes in order to invent something useful.

So, please don’t let us, technologists, news or media slloowwlllyyy turn you into a speedoholic.


> No one likes to wait for the computer to respond.

Ha, you wouldn't know it from using a modern computer interface. I have a hobby of counting the seconds between when I click something on my phone, a computer more powerful than supercomputers were in the 2000s, and when something actually happens. It is often in the double digits.

This may even be a result of the thing you're describing: our desire for speed has bled into our 'productivity' as developers, leading us to sacrifice the user experience for the sake of (theoretically at least) building an application faster. Though honestly, I'm not convinced that's actually happening because we've introduced so much complexity in our effort to automate as much as possible that we've actually just made everything much worse.


Yeah, Microsoft Teams occupies 2 GB of RAM at startup and switching between "chats" takes a hefty 15 seconds ALL THE TIME. I developed PTSD at the very thought that I gotta change to a different channel.


My grandma never moved on from Lotus 123. She made a lot of money with that application. Excel and most new stuff "was just too slow" for her. I didn't blame or shame her a bit. She was correct in that people were trying to foist negative productivity on her--and she always proved her point to me. She is far too stubborn to accept a loss in productivity.


Heavily agreed. Our psychologies are becoming unrooted from our bodies' physical needs. Years of evolving towards the mental stimulation we seek these days are detaching us from our bodily sensations, and often we can't feel the bodily consequences of our choices until a complex chronic health problem arises. We're burning ourselves out on a global scale.


If you like Honore's talk, he also has a book of the same title. I read that a few years ago, and it really changed my opinion about a lot of things. I try to go slower now on things, though I still struggle a lot. Eating is a big one, though I put that down to 13 years of conditioning on 23 minute school lunches, coupled with 5 years of that while teaching. Such a hard habit to overcome.


Slow is smooth. Smooth is fast.


No one likes to wait for the computer to respond, but the old-fashioned software did respond faster on thousandfold weaker hardware.


Somehow the OP didn’t remind me of this pertinent article [0] but your comment did.

[0] https://avc.com/2009/10/slow-capital/


> -10C weather

Does it ever get this cold in Hong Kong?


I am a bit younger than you and I am a programmer, I too feel the same. I am not totally lost (yet), but I think I will get there soon :(

No politician wants to repair existing infrastructure (bridges etc), but every politician wants to build new stuff. Because new stuff is what gets attention. Same situation in software. If I make a particular old feature 3 times faster or if I fix a 3 year old bug, nobody is going to take notice. But if I add a new feature, it is going to be noticed, whether the feature is needed or not. Then quickly forgotten, only to move on to the next "feature". This is what happens when your bonus, promotion etc is tied to shiny new stuff. People do what they need to do, to get ahead :(

As I grow older, I am more and more appreciative of things (anything - physical or digital) that do one thing and do that one thing very well. When I was a kid, my dad had a bicycle. That thing weighed a ton, looked butt ugly. My family abused that bicycle to the max, and it just worked, with almost zero maintenance. Same with every household item we had. They were basic, but they worked flawlessly, for a long time. And the reason they worked well was the absence of useless, stupid features that nobody needs.

I don't know what the solution is. But I am just tired. This doesn't even take into account the shiny new, half baked, undocumented tech that comes out every day and gets adopted for no reason.


This is precisely how I feel about Bluetooth. I've actually switched all of my headphones back to wired 3.5mm aux connections because it does one thing very, very well -- it's absolutely predictable. I can't stand how Bluetooth tries to support speakers and watches and thermostats and everything else on the planet that your phone could possibly connect to.. poorly. After a couple of years of messing around with audio playing out of the wrong device, calls where the audio disappeared into the ether, and built-in announcements about "low battery life" every 5 minutes when below 20% charge, I realized that the jack-of-all-trades Bluetooth is just a poor solution for audio. Using the headphone jack has been a breath of fresh air. I just plug in and start the audio or hop in my meeting. Easy as that.


Unfortunately, Sony headphones don't even support the mic over a wired connection :(. It's (terrible-audio-quality) Bluetooth or nothing.


> This is what happens when your bonus, promotion etc is tied to shiny new stuff.

Maybe some of that is happening, but it's also easier to build new features than to evolve existing ones. (I suspect this is also true for roads and bridges.) So even if bonuses/promotions were tied equally to old versus new, the path of least resistance skews people toward new.

In software at least, I think much of the problem boils down to developers being unable to read or understand the project codebase, partly due to devs not prioritizing code readability, but also because the tech industry has an endemic problem of devs rotating through jobs every 1-2 years.


> it's also easier to build new features than to evolve existing ones.

Yes, that too

> endemic problem of devs rotating through jobs every 1-2 years

I work as a contractor. I prefer long term contracts, even if they pay less. But the past few months, every job requirement I have gotten is for 6 months, some as bad as 2 months. What a person can achieve in 2 months, I do not know (unless it is extremely well defined job and the contractor gets help). Maybe this is because of the pandemic, I don't know. But employers and employees both deserve blame for this high churn.


Software isn't driven by users' needs, but by how many tickmarks the marketing people can put on the brochure. This is doubly true for enterprise software, where the user is almost never the person who purchases the software (and vice-versa).

Companies struggle to get big by serving their customers well and outperforming their competitors, but once they reach a certain level of success, this is no longer necessary and products fall into a state of neglect. We saw it with Microsoft years ago, and it continues to this day, and we see it with Google.

The biggest reaction to new versions of Windows in the last 20 years or so has been "I don't want it." (with the possible exception of Windows 7, which was the last version that was significantly better than the previous version).

The most common advice I hear about Google is to not allow your business to depend on any of their products because they will pull the rug out from under you without warning when they get tired of a product, or when they replace it with something that's greatly inferior.

And in an era of instant updates, there is no incentive to make products robust and stable because it can always be patched next week.


You (we) may be experiencing future shock: "too much change in too short a period of time", from 'Future Shock', Alvin Toffler, 1970.

FTWA [0]:

> Alvin Toffler argued that society is undergoing an enormous structural change, a revolution from an industrial society to a "super-industrial society". This change overwhelms people. He argues that the accelerated rate of technological and social change leaves people disconnected and suffering from "shattering stress and disorientation"—future shocked. Toffler stated that the majority of social problems are symptoms of future shock. In his discussion of the components of such shock he popularized the term "information overload."

[0] https://en.wikipedia.org/wiki/Future_Shock


At 52 - and still developing stuff, still advising, etc. - I'm more suffering from "Future Annoyance" than shock. I'm despairing at the seemingly huge wastage going on - the rapid shifting from one javascript framework du jour to the next, the rapid versioning of e.g. Qt from 5 - where I ported one of my apps - and bang we're already at Qt6 now ffs!

And on and on it goes. It seems like many thousands of man-hours of work happen in the tech world, only to get dismissed and dumped in favour of $new_exciting_thing, and it's getting annoying and disappointing.

It seems to be a case of rather than iterating on something, someone else comes up with $new_thing and that becomes the fashion du jour for a period of time - lots of people quickly jump ship to the new thing - until /that/ stuff gets dumped in favour of $some_new_new_thing. Repeat ad-nauseum.

I'm undecided on whether that's a bad thing or not - but the pace of the change(s) is what I'm finding more Annoy than Shock.


"Future Annoyance" is definitely the word. I know it's not just me because it hits our entire team all at once when something stops working.

We deal with a lot of clients so we use Skype, Teams, GotoMeeting, Zoom, you name it. But Skype regularly causes a collective "WTH!?" in our team. Skype devs seem to love moving things around without warning. Or making features like "ring group" an extra button after you "call group". Most recently they moved how you close a picture when you're in a call. Why they did this I have no idea. But I saw it catch multiple people on my team.

Honestly, I'm tired of it. The cognitive load of all of this is too high. I have work to do.

edit: I should add an observation. Whenever a feature is first changed or simply moved, it almost certainly means it's not going to work. Quality in this day and age is non-existent.


At my previous job, we used Webex Teams. It was fine when it worked, but it always had annoying bugs. There was a new release every month or so, and while the annoying bugs often went away, they were inevitably replaced by new annoying bugs.

At my new job, they use Microsoft Teams. While I haven't experienced most of the problems being described here, I'm not particularly impressed. I did have to spend about 15 minutes one day trying to get a call to work with a coworker, because it kept totally locking up. I finally solved the problem by disconnecting my microphone, even though I had used it successfully earlier that day. Teams is working again with my Samson mic, but who knows when it will poop the bed again.


Kids nowadays have no respect making single-page applications on their iPad. Seriously though, the old days of writing software from scratch are gone. This reliance on frameworks and the associated toolchain complexity is a big mistake. There is a reason C is still loved after 50 years and that is because it is simple. It's too bad we're going backwards.


"C is still loved"

Citation needed.


:waves:

You can cite me for one! I didn't even grow up on it, it was honestly a more recent thing for me to pick up seriously, I wrote significant amounts of JavaScript, Ruby, Go, Rust and various LISPS before I seriously used C in anger. Nowadays it's the language I reach for second most often after JS and it's honestly something I often really look forward to writing. I love how it just gets out of my way and lets me do things, it's so fucking refreshing.


You should check out Zig; it's the closest thing to C with a lot of the rusty edges removed.


> I'm undecided on whether that's a bad thing or not - but the pace of the change(s) is what I'm finding more Annoy than Shock.

The most dangerous aspect of this phenomenon is that the platforms our applications run on are now highly unstable. We live in an era when software like operating systems and browsers will happily update itself whether we want it to or not. You can't build a castle on sand and expect it to stand when the tide comes in.

At least if the platforms provided stable, standardised foundations, those application developers who wanted to provide software with longevity and compatibility could do so. But the likes of Microsoft, Apple and Google seemingly have no interest at all in providing those stable foundations any more. If everything is throwaway junk then everyone needs to buy the new throwaway junk next week too.


> Qt from 5 - where I ported one of my apps - and bang we're already at Qt6 now ffs!

It was 8 years between the release of Qt5 and Qt6. Qt's one of the things I consider quite stable compared to many other frameworks these days.


I think you mean from Qt4 to Qt5.

When I first started writing Python apps using Pyside/Qt4, it was just before the transition to Pyside2/Qt5.

I put off moving my Captain's Log app from one to the other until I couldn't put it off any longer. Then, seemingly only a year or so later, we have Qt6.


Qt5 was released in 2012. Qt6 was released in 2020.

https://en.wikipedia.org/wiki/Qt_version_history


Thanks.

I think my timing was more down to the production readiness of PySide2, which meant I had to hold onto PySide and hence Qt4 until PySide2 was ready/mature enough to trust porting my existing app to it.


I'd read that book for the first time only about a year ago. It has some warts, but in general has aged spectacularly well.

It is weakest on predictions of the significance of specific technologies, many by those directly involved in their development or promotion. Some of those have panned out (most notably, of course, information technology), but most haven't, proving impractical or nonviable, occasionally socially unacceptable.

The cautions about psychological overload are strongly on-point.

Where the book is likely underappreciated is in its predictions of social change. These have largely been so successful, especially about youth culture, gender, and race, that the prior state seems utterly foreign to anyone born afterwards. The predictions to a large extent are our present reality, and seem trite simply on that basis.

(This is a general challenge in forecasting: unsuccessful forecasts are glaringly obvious, successful general ones are too ubiquitous to be apparent.)

Toffler cites a great deal of research (though in following up on his footnotes, some citations seem inaccurate), and both the original sources and subsequent works citing them are of interest. He leans particularly heavily on Herbert Simon, a polymath (psychology, sociology, economics, mathematics, AI) with many powerful insights (and a few blunders).


Blaming the majority of social problems on it is just plain silly. Take any given time period n before the present day t (the span from t - n to t) and freeze it in stasis for decades. Being stuck using DOS wouldn't make social problems evaporate, let alone the majority of them.

The actual stressor that gets fixated upon and scapegoated appears to be something ironically very old and unchanged - memento mori. The Eiffel Tower was called hideous, something to tear down, by those who knew the city long before its placement. For those who came later it was part of Paris, and tearing it down would be sacrilege. Beyond a certain point in life everything new becomes a marker of your obsolescence, because you personally remember how things were before. Those after you never would.


This comment doesn’t address the implications of the rate of change constantly accelerating.


No, this is not future shock - quite the contrary.

Future shock comes from "enormous structural change". As David Graeber pointed out, major structural change and scientific breakthroughs have slowed to a crawl.

E.g. we don't have teleportation, warp engines and computers integrated in our skulls. Not even flying cars. Compare it with the changes between 1910 and 1960.

Here we are seeing the opposite: endless churn and reinvention of the wheel around very minor stuff.

Graeber got a lot of backlash for pointing out these facts. We are not supposed to criticize the gods of technology.


Smartphones are basically computers in our skulls: a radical shift in human existence, from most people not looking at computers at all to being tied to one for hours a day. You can pretend the world isn't rapidly changing, but that's just a bias toward living in the present. Of course we haven't invented fantastical physics-defying devices yet, but the technological difference between now and 1970 is just as stark as the one between 1910 and 1960, just in different ways.


> We are not supposed to criticize the gods of technology.

The downvotes are coming as expected.


I think there is a trend of everything being geared towards new users (i.e. growth) - as soon as you become a user, you're less important. For instance, I've noticed that on many websites, 'Sign-up' is featured more prominently than the 'Sign-in' button. It feels like UIs change to suit new users who are used to the UX 'flavour-of-the-month', lowering the initial learning curve as much as possible even though it confuses the existing (captive?) user base.

The unfortunate side-effect of this is that users will never become proficient in the software tools, with 'power-users' being a thing of the past. As an example, I think MS Office peaked for me probably around 2010 in terms of features and my proficiency and now I'm actually regressing. If I want to export a PDF of the file, I can't remember if I should go to 'File -> Print', 'File -> Export', or 'File -> Share', because it keeps changing.


No helpful content on my part, but I just want to say how much this question speaks to me. I'm a 54 year old statistician/business analyst and have always embraced change and technological improvements. It seems about 6 or 7 years ago things started to shift where many of the tools I used started making interface and functionality shifts with little to no documentation or support. Apps would update overnight, and the way developmental tasks were executed just didn't exist any longer. Then I would just wait till someone on reddit etc, found a solution or where the functionality was buried so that I could move on. I really hate turning into a curmudgeon.


This is what drives me nuts. So many web apps have perennially obsolete documentation.


I hear you. I mostly get frustrated by changes that aren’t actual improvements, just change for the sake of fashion. I also get frustrated by “new” things that aren’t new. Everyone is all hot now on “web3”, and talking about decentralization. Well... the original web was decentralized. The centralization of the web to Google, Facebook, etc is a result of business forces, not technical constraints. Anyone who thinks that some technical solution is going to cure the centralization of money and power is hopelessly naive.


Thomas Friedman's "Thank You for Being Late" is a massive treatise on the idea that things aren't just changing fast, but that the rate of change is accelerating.

https://www.amazon.com/Thank-You-Being-Late-Accelerations/dp...

For a while after reading this, I believed that maybe we were in a bit of a slump and the rate of change wouldn't continue to accelerate. Now, with the societal / technological change brought by covid, the recent wave of building and excitement around Crypto (NFTs, DeFi, Web3), the continued jumps in AI, and a possible shift from phones to AR/MR glasses in the next few years, etc it's hard not to feel that the acceleration is continuing.


Consider the two-dimensional curve, parameterised by t:

    x = a * sqrt(t) + r * cos(t^2)
    y = b * sqrt(t) + r * sin(t^2)

If you are monitoring long term progress towards (100a, 100b) you are disappointed to see that progress is slowing. And yet the short term whirl is whirling ever faster.

I suspect that a weak version of Sapir-Whorf is in play. It is difficult to describe the equation in prose. The limitations of natural language handicap us in our efforts to defend against the pathology it exemplifies.
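
A minimal sketch of the curve above in Python (the sample values of a, b and r, and all variable names, are my own illustration, not from the comment): it tracks both the distance still to cover toward the goal at (100a, 100b) and the speed of the whirl, and shows the first barely shrinking while the second keeps accelerating.

    # Sketch of the spiral: progress toward (100a, 100b) grows like
    # sqrt(t), while the angular position t^2 spins ever faster.
    import math

    a, b, r = 1.0, 1.0, 5.0  # illustrative values (my assumption)

    def point(t):
        x = a * math.sqrt(t) + r * math.cos(t ** 2)
        y = b * math.sqrt(t) + r * math.sin(t ** 2)
        return x, y

    for t in (1.0, 10.0, 100.0):
        x, y = point(t)
        remaining = math.hypot(100 * a - x, 100 * b - y)  # distance left to the goal
        whirl = 2 * t  # d(t^2)/dt: angular speed of the short-term whirl
        print(f"t={t:5.0f}  remaining ~{remaining:6.1f}  whirl speed ~{whirl:5.0f}")

Between t = 1 and t = 100 the remaining distance hardly moves, while the whirl speed grows a hundredfold: ever-faster spinning, ever-slower arriving.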


> I suspect that a weak version of Sapir-Whorf is in play. It is difficult to describe the equation in prose. The limitations of natural language handicap us in our efforts to defend against the pathology it exemplifies.

I love the fact that you gave it as a parametric equation but I think at least a sense could be given in English "When something is spiraling towards a goal (think a bit of fluff going down a drain or a spinning coin slowly settling) you can find it simultaneously going faster and faster while still always taking longer than you expect to get there."

Or, to quote Art Garfunkel:

> But the ending always comes at last
> Endings always come too fast
> They come too fast
> But they pass too slow


I have an alternative view from a lot of commenters. This is a symptom of tragedy of the commons + data driven decision making. Product Managers (of which I am one) are focused on advocating for changes that benefit the users. Unfortunately, as scale increases it becomes impossible to just have conversations with users and extrapolate from that to a tastemaking decision. Instead we have two possible pathways as an app grows:

1. You make decisions by aggregate telemetry. This leads to degradation toward the lowest common denominator, and to changing the UI just to drive statistical movement.

2. You make decisions by intense user research with a small subset of key customers. This leads to "enterprison" disease and, if done too early in a company's lifespan, can keep the company from ever growing beyond effectively being an outsourcing partner for its largest enterprise clients.

When you have 25 million users, you can neither have a conversation with all of them, nor can you please all of them, so you end up having to make the best decision you can based on the data available to you, and we all suffer in small ways, but hopefully benefit in larger ways.

It's hard to understand for those of us (I'm nearly 40) who grew up working with technology, because the changes have happened in leaps and then suddenly in tiny increments constantly, like boiling a frog, but along the way the number of humans using the Internet and technology massively massively massively increased. The scale of the Internet today is unprecedented for any previous product or technology in known human history. Fundamentally, that means everything must either specialize and focus on a niche set of customers or it becomes driven by the tragedy of the commons.


Thinking as a user of various software, this is all I want:

1. As close to zero bugs as possible, aka, it should be dependable

2. Work as fast as possible. How much time is humanity collectively losing because JIRA takes its own sweet time to load? :(

3. Clearly communicate what it is trying to do, and just do it

We don't constantly think about stuff that works well. It just becomes part of our routine and we forget it. It would be nice to have software that gets out of the way instead of taxing our brains to think about yet another menu item, yet another feature....


Spot on - anything else renders software less useful.


> Fundamentally, that means everything must either specialize and focus on a niche set of customers or it becomes driven by the tragedy of the commons.

Very true. Increasingly I get the sense that almost every piece of consumer software I use is not really concerned with what I want so much as with what other, much larger demographics want.

That's fine, but it is a bitter pill to swallow after feeling "at home" in technology for a long time.


>This is a symptom of tragedy of the commons + data driven decision making. Product Managers (of which I am one) are focused on advocating for changes that benefit the users.

This is true of good product managers. Unfortunately there are far too many who are (often stupidly) focused on short-term conversion gains, often to the detriment of the user. (e.g., let's hide the Foo Widget because data shows users who click the Foo Widget convert at a lower rate!)


The most frustrating part to me is the sheer amount of mystery meat navigation. I cannot count the number of times daily I cannot figure out how to use an app because of menus hidden to the sides, lack of scrollbars, or otherwise no indication that something is clickable.

Maybe I'm just old and curmudgeonly, but it feels like UI design has become a cesspit of ever-changing ideas.

It's also beyond frustrating to me when things that should and could work together don't, because they're from different vendors and we daren't give users a good experience, as that might allow other companies to exist and users to be empowered.


> Maybe I'm just old and curmudgeonly, but it feels like UI design has become a cesspit of ever-changing ideas.

Taking away users' skills is objectively bad.

Up until the web, most UI elements were organized hierarchically, written in your native language, had keyboard shortcuts, and were searchable via help, themeable yet standardized. Those things were called drop-down and context menus! The structure and options could literally be described in just a couple of kilobytes of data, and most programs followed guidelines, so they were deltas of each other. Now the whole thing is a multi-megabyte shitshow that's a vortex of confusion and rests upon a layout/rendering engine that few people on Earth understand. Every GUI is different, and they all suck. We're in the era of user "experience"--horseshit! It used to be called user interface (because you know, hey, users actually interact with the dang thing). I wish UI designers would go back to boring things that are usable. I don't want to live in a multimedia Louis Vuitton commercial FFS.


I feel like that one (UI elements going away) is also a case of evolution in UX and how we interact with apps. The best example there is when Apple moved from skeuomorphic design (which is also the first time that word became well known) to flat (?) design.

They realized that they were at a point now where users were used to e.g. buttons not looking like buttons, expecting to be able to swipe or scroll without a clear 'you can swipe / scroll here' indicator, so they did away with a lot of those things to be able to put more things on the screen (without cluttering it).


I'm not against change in general, but I think it's very arrogant to assume even the majority of people know when you can scroll or swipe. Sure, people might try out of desperation, but requiring random actions to figure out the UI isn't good UI.


I'm 59, and I am able (just barely) to keep up, but I have had to specialize.

When I was younger, I was able to understand (and work with), a significant swath of the industry.

Shallow and wide worked for a long time.

That has not been the case, for many years.

Dependencies help to give people the sense that they are able to "keep up," but that's because they abstract the complexity. There are a few prodigies that can handle wide and deep; but they are rare exceptions.

I like to have a deep understanding, so I have specialized in native Swift development on Apple systems. I am proficient (but not a wizard) at PHP-based servers. I know enough to create backends for my apps, which work well. I know that others can do better.

This is not unique to software. All industries have been like this for over a century. Software had it easy. It was new enough, and small enough, that many of us could understand a great deal, about the entire landscape. This is how new tech always starts off. It used to be that every automobile driver was also a mechanic (out of necessity). I think the Wright Brothers were bicycle mechanics.


It's not the rate of change that is the problem. It's the quality of the changes and the way they are being implemented.

In all of your examples you keep mentioning that they either remove or change something you like, thus forcing you to do something to keep up. That is the real problem: they are making changes that the user is forced to deal with. If all of these changes were fixing things you were legitimately annoyed with, you would be a happy camper, but that doesn't seem to be the case.


100% agreed. Whenever a feature is moved or changed, you can almost certainly assume it's going to be broken as well.


None of the product managers in the companies you cited are paid to stay the heck away from features that are already working. They are paid to scale those features, add more use cases, and re-shuffle the design so that all the latest use cases are captured. This means the software is evolving at a pace the industry hasn't seen before.

And there's another dynamic at play. The design community these days is infected with Apple-driven thinking: treating users as stupid and worrying about decluttering. The focus instead should be on how a design will scale for known unknowns and unknown unknowns. How will complex use cases compose with simple use cases? What happens when an unknown error happens? These are things that nobody seems to worry about in the UI design sense, IMO.


Before SaaS (all the products you mention?), people could reject changes by refusing to update to the next version - see Windows Vista. Now that feedback mechanism has been lost, so there's less of a handbrake on developers pushing forward too fast.

Maybe the pendulum will swing back towards self hosted/installed apps. Alternatively maybe there's a good business model cloning the "good old version" of popular SAAS apps and keeping them unchanging.


I think software companies feel under pressure to fill the extremely large boots of their wildly expensive SaaS plans.

I can only hope the next wave of "innovation" is "pay only once for your static needs!".


> Alternatively maybe there's a good business model cloning the "good old version" of popular SAAS apps and keeping them unchanging.

To some degree that's already happened, with e.g. the myriad of feed readers growing in popularity after Google shuttered theirs. And thanks to GDPR, a company is required to hand over data to users if they demand it - which the 'clone' can import.


Apart from the current answers, let me give another reason: product managers (I'm also one). We are excellent at finding niche corners, performing A/B testing and this and that, and then making a business case to increase whatever metric by whatever much. This has not much to do with the tooling but with the current climate of doing business. On their own terms, indeed, metrics go up and profits demonstrably increase.

However, in (100-eps)% of cases this demonstrably worsens the product, or at least erodes its initial innovative aspects. The reason is that everybody thinks locally and assumes the function is monotonically increasing, and when they hit a plateau or decline, they make extreme changes to the product (calling it disruption in the meantime). And much to my regret, management is one of the least accountable professions of our era, similar to Human Resources. If you listen to PM courses, cringeworthy practices are sold as success stories. On top of this, we have far too much power with very little accountability; hence, following the entropy principle, a much higher probability of doing the wrong thing.

Writing this while I desperately wait for Jira to render the page.


> We are excellent at finding niche corners, performing A/B testing and this and that, and then making a business case to increase whatever metric by whatever much.

I've always had my reservations against this approach but at this point I am wondering if data driven product design produces positive results at all.

Clearly YouTube PMs did run experiments before deciding to kill the dislike button, but didn't pause to think what the real purpose of that button was supposed to be.


True, but you might be surprised what that dislike button is actually correlated to. I don't know, but I can guess. Often "content creator experience" is way less important than preventing some companies from having their campaigns downvoted to death. So there might be a strong commercial incentive, which YouTubers also don't pause to think about - who is benefiting from this?


They already had the option to let the creator hide dislike counts.

YouTube is less useful now that I don't know what's a clickbaity scam.

All these PMs and not one of them thought.. "hmm.. would we remove product ratings if we were Amazon"


I'd like to point out that "performing AB testing" is itself exactly the problem described by OP. Change some functionality out from under some users, to find out how they adapt. It's essentially performing ad-hoc psychological experiments with your regular userbase.


Yes, but I don't see your objection. That's basically what Google, Apple and the like are by definition. The marginal benefits and the techniques used to reach them are very sound. The cost function they choose is not.


I don't understand the distinction you're trying to make. The past alternative was to hire UX testers, do focus groups, and have alpha/beta testers. It's obviously advantageous for companies to instead push this work onto the users, and the cost function they choose of course doesn't take into account what are now externalities.


There's way more to the A/B testing than UX. UX is just a fraction of the total product. Users are the subjects because their behavior is what you make money out of. Whether it's moral is a different discussion.


The UX part is mutually beneficial, so it's at least possible to make an argument that companies doing so are benefitting users more than traditional UX design processes would. I still think it's creating externalities for users and would appreciate informed consent (something along the lines of bringing back proper versions where nobody would ever be forced to upgrade).

Optimizing to the detriment of the user (eg for engagement, conversion, whatever) is straight malevolent and I've yet to run across a moral justification for it apart from market nihilism.


W. Edwards Deming is strongly recommended as an antidote to the local optimisation / global Schlimmbesserung (an "improvement" that makes things worse) phenomenon.


I'm in my 50's and I think it's our age.

I started coding in the 80's, and got my first coding job early 90's, and the tech was moving even faster then.

I now have computers that last more than 3 years without becoming completely obsolete.

There's been one fairly consistent programming environment for at least 10 years now. We had "micro" in the 80's, object-oriented in the 90's, the internet in the 90's, and mobile in 2007. I spent less than 10 years making desktop applications in VB, and then both of those things stopped being at all relevant, while I've been coding web stuff in Go for 10 years now and it's still very relevant.

But I do find changes annoying now. I used to embrace it all, and be keen to learn all the new stuff. Now, not so much. I don't know whether I'm jaded with experience, or just old and want the world to stop changing around me.


Also in that age group, but I was tired already back in 1995. I was an enthusiastic hobby Win32 programmer. Win32 was considered "hard" back then because it was a flat set of 1000+ functions, plus an event-loop-based programming model. It had a complete printed reference spanning two thick books. But then, to make Windows programming easier, MS introduced MFC (the Microsoft Foundation Classes). Well, that had 7 or 9 reference books, equalling 3 times the total thickness of the Win32 ones. Since then I've done a lot less Windows programming.

Anyway, that's just one of a million examples of this phenomenon. You have to keep learning more things just to be able to make the same stuff you could have made 20 years ago with simpler tools. This doesn't bother some people, but I think that set of people is getting smaller.

In my opinion this is the reason why women, who used to choose programming careers in the 70s and 80s, do not get involved as often today. Only people who are obsessive can continue to care about all the new versions of every software product or API, despite them offering little improvement. And the people at the high end of the obsessiveness curve are primarily men. If you think a healthy work environment needs proportionally more women, fix this.


> I'm getting on (I'm nearly 50) - not a software dev (thank god) but more a project manager. I do a lot of the "knitting together" type work between developers, UX people, designers, content owners, etc.

In your org, do you give developers fat sweet bonuses for simply maintaining things, or more like for building new features? Because that's how those big tech companies operate.


Another force: convincing, sociable magpie developers pushing upwards and sidewards to pursue the next new shiny thing. Not because they need to, but because they want to work on shinies.


I run a tiny (2 person) digital agency, so we don't actually have developers. I mean, we outsource lots of work. Also my sector is non-profit, so the skew on big tech is a bit different (it's less about profit and more about engagement).

...and still I notice the pain of change all the time!


You are probably not the target clients for those product companies, that's why they can ignore your needs.

I think in general we are going through very strange times. Look at games - most of them are either released under-baked ("early access") or exploit your instincts to make you spend more than expected ("freemium") or are essentially making users pay for the same thing twice ("re-master").


I think there are two distinct phenomena happening:

1) Rate of change in fundamental technology.

2) Feature churn.

I don't think #1 is actually changing all that fast, compared to previous decades. Consider the period 1991-2001 and the period 2011-2021. I think technology change in the former was much, much faster than in the latter. A typical PC went from {486, DOS, 1-4MB RAM} to {Pentium 4, WinXP, 256MB-1GB RAM}. Linux had only just launched in 1991. ~Nobody had a cellphone in 1991. ~Nobody was on the internet in 1991.

But look at 2011-2021, and is anything really that different? Computers are faster, but it's nothing like the growth rate of the 90s. iPhones, Bitcoin, GPUs, broadband, cloud, Minecraft ... we had all these in 2011. They're just incrementally better now.

Fundamental tech is still incrementing but revolutions are few and far between.

#2, on the other hand, is in its golden age. And it's all for the wrong reasons, largely articulated by others on this thread. My addition: our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.


> our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.

Great insight. I used to put it snarkily as "developers gotta develop" but your description is clearer and I think gets at the root of the problem: Companies have all these developers sitting in chairs, and they need them to do something. But the Good Idea faucet is not flowing sufficiently to keep everyone busy, so instead of 1. having the developers fix bugs, improve quality, tighten security, etc. or 2. declaring the product finished and moving on to another one, companies are choosing to 3. Turn on the "Bad Idea" faucet and just keep changing for the sake of changing. Almost every major "legacy" product 5+ years old is in this churn-for-the-sake-of-churning phase, and as a user it's awful.


Because modern-day companies are broken. They hire engineers whose raison d'être is to "work" and pump out features, regardless of whether products have reached a level of stability and maturity that most would consider finished. The entire ecosystem is busted. Look at Dropbox, for example. Steve Jobs famously said that it's a feature, not a product. Why do they hire thousands of engineers and keep adding things that no one really wants? Because they can't just fire people, so execs and PMs keep generating fake work until products bloat out of control. What else are they going to do? Are they really going to admit that there isn't anything left for them to do and get laid off? Never.


I'm glad I'm not the only person who immediately thought of Discord when reading this. If there's a great seamless syncing storage alternative, I'm all ears.


I think it has to do with too many products being companies.

If your SaaS company is one product, then you HAVE to keep developing, and tweaking, and optimizing.

If you claim you're "done" you have to downsize to a maintenance skeleton crew.

So instead we get feature bloat, and everyone has to have this huge dashboard.

If instead we had something like Johnson & Johnson for software, this could be avoided. They'd just make thousands of SaaS products that each do a single thing extremely well.


First, I was thinking that it looked like the first signs of https://en.wikipedia.org/wiki/Technological_singularity.

But then I realized that everything you rely on is proprietary software which puts profit over user experience and therefore breaks working UX to get even more profit (something like this: https://news.ycombinator.com/item?id=29454289). Consider using free software alternatives instead [edit:] whenever you can.


Reminds me of the Scorpion and the Frog:

https://en.m.wikipedia.org/wiki/The_Scorpion_and_the_Frog

Why would a corporate owned proprietary app or website ruin its own interface?

Because that is its nature, it's fundamentally only in alignment with user wishes until it's half way across the river.


Thanks, but I'm not sure that's either true or a connection that is always entirely valid. For starters, my use cases are often driven by my clients. I don't necessarily have the choice to "not use Google Analytics" - all my clients do, as does the sector I work in.

Secondly, Wordpress - open source.

Thirdly - sadly it's the case that very often commercial options are better than O/S ones. Trello is (was) a good example. There just isn't a good O/S tool to replace it. Similarly, in my long experience working with projects, the O/S offerings are fairly poor, often not very well maintained and lacking in core features.

I know I'm basically asking for gold here: "I want all the features but not too many / all the time" - but the rate of change just seems to have increased to such a huge extent recently and I'm not sure users are ever really at the centre of these changes.


Many fine comments already. I think much of it boils down to the same thing that happens everywhere else. People forget the “spirit” (the point) of a thing, and focus solely on the “letter” of the thing. They lose sight of the why, the problem that’s actually being solved, etc. Software companies, especially VC backed, are caught up in growing revenue (or similar) toward an exit, which is essentially never the spirit/point of anything non-parasitic. Trello is a fine example; after the acquisition they were run by people completely incapable of recognizing bad ideas and “good but wrong ideas” because they were not at all aligned to the “spirit” of Trello. Happens with governments, entertainment, religions, products, pretty much anything where people who don’t understand and get on board with the “why” start mucking about.


More than once I've found it easier to move to a new vendor than learn how to use an updated site. Especially when a site I use is acquired by another and they want me to migrate my account or whatever. It's like, great, your founders got bought out and now you've made it my problem, congratulations I guess.

Disclaimer: I, too, am old.


I too am old, but I also do the same thing. We know that once something that was simple and great at its job starts being ruined, it's over. It will become more and more bloated and as it starts shedding users it will happen with increasing speed.

So, time to find a Trello alternative that is still just a board with columns.


I think the rush to cynicism clouds the bigger vision.

Things are changing fast because the challenges we are trying to solve are getting more complex and the tools need to be used by a larger swath of people in more subtle contexts.

This idea of "back in my day we had real wrenches" is exactly what you fear it is. Sure, you could build a house the same way they did in 1850, with crosscut and rip saws that you had to stop and sharpen every day, and nails you forged yourself. But today we need to 3D print houses out of concrete because there aren't enough trees, because all the old growth tight grain fir was clearcut over the past few centuries and even green wood grown to full size in 5 years instead of 100 is too expensive.

But I think the vision is faulty to begin with. When I started using computers in the 1980's for work, there were multiple text editors, and they all behaved differently, and yes, would go under or get upgraded. I loved working with PFSWrite on the Apple //e, but that eventually went under and we moved to IBM PCjr's running WordPerfect. It was a pain in the ass - why did they have to change my favorite text editor?!?

The pendulum probably won't swing for another 5 decades, when things become more "stable".

Also, I'm pleased to see so many people who are my age (mid 50s), and also kinda bummed to see so many people who are like me replying. Either it is endemic to us, or only we care about it and the diversity of HN is off on other discussions leaving us to kvetch.


> Things are changing fast because the challenges we are trying to solve are getting more complex

Are they, though?

The average SaaS today has nothing like the complexity of a major desktop application from 20 years ago, from a user's perspective. Most of them are glorified chat apps or CRUD front ends.

Compare that to something like the DTP software used to do complex magazine layouts, or the CAD software used to design ships or skyscrapers, or AAA games back when the AAA games actually invented new gameplay mechanics and unique presentation styles instead of mostly being this year's version of Franchise X built on engine Y.

If anything the trend is for most application software to become ever [simpler|more dumbed down]* over time. One of these days mobile apps will only accept input via Morse code, because Google will have declared that the new state-of-the-art UI shall be a single invisible button filling the entire screen.

Of course more simplicity in the presentation is probably a good thing for many types of software. We just shouldn't need the current house of cards to build it.


I notice the same. I don't think it's an age thing. Products change their UI / UX often and it takes a mental toll to re-learn using them.

The Basecamp founders said in a book or podcast that they intentionally keep old versions of their website UI around, for exactly this use case – to avoid forcing people to relearn a UI. I don't know if this is true anymore, so fact-check me on this, but if it is: consider giving them your business.

----

What's the underlying reason? I can only speculate, but having worked with product teams for a while, and gone through several redesigns, I suspect it has something to do with: new leadership coming in, and risk-averse, unimaginative PMs.

New leadership because that's the #1 project to get the troops rallied behind you and to have that quick impact to put on your promo doc – get a new design out, make your stamp on the product, regardless of whether it's good or bad.

Risk-averse, unimaginative PMs because a redesign is a safe product change and, similarly, a quick win to put on the promo doc. If you lack any ideas of where to take the product next, a redesign is an evergreen solution. You could spend time investigating customers' issues, having interviews with them, analyzing the competition AND be "unproductive" while you do this (i.e. not have anything to show for it for a while), or you could be productive and do a redesign.

There may be other incentives that are misaligned, but those are the two I have noticed most pronounced.

My 2 cents. Curious if others have thoughts on how to solve it.


Honestly, I'm really worried about the bloat most software is accumulating, plus the terrible state products ship in. I'm not talking just about ordinary everyday work software but also games.

In general, software stacks are getting bloated and overcomplicated, and new updates/features are coming out super buggy.


Do we change UIs for no good reason? Sure. Far too often? Yes.

But the underlying approach of using cheat sheets to walk step by step through work tools is not a tenable one, and certainly isn't for cloud-based software (i.e. software that changes without asking first).

Instead, cheat sheets should be for core concepts (a purchase order has these components… a packing slip must be compared against the order quantity, etc.).

That way you know what everything is, and even when the UI changes you know what concepts you’re looking for. If the UI is any good, it’ll surface the main stuff and tuck away the rare stuff, but it’ll all still be there (if it’s suddenly not, that’s when you have to complain or switch vendors).

When I was a kid my elderly aunt would ask me to teach her how to check e-mail or open solitaire by writing down click steps for her. Inevitably a week later and a month later she’d need me to show her again, because she refused to develop a mental model of what a “button” is, a “menu” is, an “icon” is, etc.

If by some chance an icon moved two spots in a menu, she was dumbfounded, frustrated at the audacity, and unable to complete her goal.


Eh. Yes, in theory. In practice, I have a bunch of procedures written for people for whom I have to explain basic filtering functionality in Excel. So even a relatively minor change in the UI, or in an underlying concept, is a big deal. Just talking about it gives me a Vietnam flashback to the time we had to update all procedures at once, and the ridiculous granularity with which I had to describe each step.

I hate to say it, but not everyone is ready to re-learn every single thing from scratch. I can, but I avoid it whenever I can. If the previous approach works, don't change it unless there is a good reason to.

Will someone please think of the users.


Was continuing to use the older version of Excel not an option?


I took GP's meaning as "these people need to have basic Excel filtering explained to them, so how can they be expected to deal with various UI/UX changes?"

I don't know the answer: at some point a person needs to take responsibility for knowing the tools of their trade. Or a person hiring them needs to take responsibility for hiring people who know the tools of their trade.

Either these tools are mission critical and therefore employees should understand how/why they work, or the tools are not mission critical and therefore it's not a big deal when their interfaces change.

There's a middle ground there where tools shouldn't change so fast and when they do it should be well documented.


Sorry. I did not make it clear. I used Excel as a way to indicate avg. user ability. The system in this case was provided by a different vendor and, in that particular case, drastically changed UI and how the system behaved.

My point was, you don't spring changes like that on users with, seemingly, no forethought.


Your elderly aunt was a packer... someone who needs a step-by-step "click this spot" list of things to get it done. She doesn't have a mental map of how things fit together; she just wants her list of actions to follow. It's how she got through life.

Programmers are mappers, most people are somewhere in between, but the few packers I worked with would be lost in this ever changing sh*tshow being spewed forth by the valley these days. That's a lot of customers to throw away.


> isn’t for cloud-based software (i.e. software that changes without asking first)

Is it inconceivable that cloud-based software wouldn't change without asking first? My employer uses the wonderful Redmine for issue tracking and we haven't been surprised by changes because we control when to move to new versions.


Of course, these products/services could unlazy themselves and create and maintain guides of their own - the rate of change would be reduced a LOT if they had to update the guides accordingly.


Yup. Almost all software documentation, even documentation that is called "good", is pretty awful. Developers grow to accept it, but give any layperson software documentation and try to get them to set up your tool/service - I bet they will quit 5 minutes in. Is that the standard we should hold documentation to? I don't know, but I bet your user/customer count would go up if it were a lot easier to get started.


I agree the onus should be on makers to provide docs or a UX that guides the user.

Would a tool that provides detailed and up-to-date instructions sell better and retain markedly more customers than one that doesn't?


Think it's important to distinguish between tech changes and UI changes here.

The former is somewhat inevitable, I think; the latter is often an unnecessary irritation.

Some UI redesigns are good & necessary, but many seem to be just for the sake of it. It's a bit like this: if you're paying UX designers, they're never going to say "yep, it is good as is", since that would raise questions about why a company is paying X thousands to people not doing anything. So you get this endless stream of reshuffling existing stuff with no real value added, and often a net negative from the confusion. Which, ironically, is a pretty horrible user experience.

See also icon changes on phones. There too the changes are breaking a lot of the user experience (quickly finding what you're looking for based on familiar icons) for the sake of, well, not much value added. Every icon looking like a rainbow definitely didn't improve my life.


I have no idea of what the latest 6 versions of iOS brought of features. I likely use none of them.

I don't even have an idea of where to go to find out. So much is hidden behind unknown gestures; some activate by accident when I want to do something else.

Text editing in Notes is difficult after they removed the looking glass. I really want it back.


Just read the changelog!

(To be fair, the iOS changelog is reasonable - full of bullshit marketing signs like Random Capital Letters for Normal midsentence Words, but it does actually give an idea of what's changing.)

Compare to the typical app message

"We've fixed some bugs"

"Bug fixes, performance improvements and more cat videos"

Or worse, the ones that are adverts

"Just like our pizzas, we're always trying to make our app as tasty as possible"


I think the point is that if the last 6 OS changes didn't do anything substantial, then why would one bother to read the changelog? Maybe you do the first few times, but after 5 updates that are meaningless to you, why continue? And if the updates are not doing anything helpful for you, at least as far as you can tell, why are they happening?

I understand there are plenty of things an end user doesn't see immediate value in that are nonetheless important, but update fatigue is real, and "just read the changelog" doesn't really fix that.


I believe gamepad support was added in iOS 13, which I thought would have been added much earlier. That's the only feature I can recall in as many versions. I mostly miss Touch ID.


I blame "continuous integration" personally. I do a lot of consulting and this has been one of the bread and butter money makers that people ask us to implement for their teams. Devs love it, PM's love it, stakeholders love it. I am not so sure end users love it. Every time you log into the tool you have no confidence that its the same tool it was yesterday. I miss version releases personally.


I don't think CI/CD has anything to do with product features getting added so rapidly.

If there were an app where, through CI/CD, you aren't adding new features but fixing usability bugs and security holes, would you still feel the same about CI?


I'd argue that continuous integration is great, but continuous deployment might not be. Real customers don't need to be aware of the continuous integration, so it just keeps things (mostly) working without disrupting users. Whereas continuous deployment (to production) means they get surprise changes all the time and can't schedule things around disruption.


Reiterating my point: it depends on _what_ you're deploying.

You're conflating a product management problem with an ops problem.


I suppose I'm not young any more, but I'm not all that old (late 30's). I tend to agree with you. Whether one likes or dislikes the changes, it makes investing in writing documentation (or recording training videos, which is even more expensive) impractical. This in turn makes the half-life of knowledge investments on the part of users shorter (independent of the age of the learner). I don't think it's just you.

Well-defined UIs can help by making discoverability of features simpler, but more often than not I'm not convinced the industry understands the difference between "the UI is easy to navigate and discover" and "the UI has soft colors and rounded corners on everything."


Stuff like this makes me appreciate that most of the software I use manages to innovate without throwing out the UX every 6 months. I use Ableton Live and Reaper a lot. Both look more or less like they did at launch 20 and 15 years ago. You could follow a manual for the first version and only be a little lost with newer features since they all follow the same UX conventions.

I had to crack open Google Analytics to grab the tag to paste into something. It took me 2 minutes to find it. WordPress is more or less the same if you go to /wp-admin/, but the infiltration of the new UX into it tells me that's on the way out.


What I worry about is how non-technical people will deal with this for just getting basic life tasks done. My parents are in their late 50s, and despite my dad having been a software dev forever, he struggles with some newer technology; the fatigue of constantly learning new systems has caught up with him. My mom relies on us to teach her basic functionality (usually me slogging through deep settings screens).

As these more complex technologies become more ingrained in our lives, people who aren't able to keep up (that is, most people, especially as they age) will fall behind.


There is a meme, “haters gonna hate”.

In this case, I think the same principle holds: “software developers gonna develop”!

Meaning that I feel businesses would prefer to see developers working as hard as possible and just assume their output is always an improvement over the current product.

At the level of competing businesses 90% of effort goes to waste due to failing software businesses, as the winner takes all.

At the internal level of a business and its teams of employees, if time was spent, the output is assumed to be useful and is pushed to end users (instead of letting the environment kill the worst 90%).


“Software developers gonna develop”: that is the sole reason for their employment. I am a developer, and I am terrified if I see the number of tickets in the backlog declining. There is always this question... what am I gonna do two sprints from now?


Same age as you approximately.

The killer problem is velocity is the only metric that matters these days to the whipmasters.

We sacrifice quality for this and that’s what’s really hurting us now. No one is seeing this because we’ve forgotten that the status quo of continuous toil is not normal.

20 years ago I’d be shot for doing things that are normal now.


Add to all of that what seems like ever-increasing turnover in employment. We used to be able to invest in training our customers to use our product and then get years and years of smooth sailing. These days we seem to be doing a lot of handholding of noobs.


That's exactly why I quit my job to start working full time on GainKnowHow.com . All the new hires during the pandemic were so disadvantaged because they just had no good way of figuring out all the idiosyncratic processes at the company.


A related question for you OP -- I'm at a similar age, with a dev background, but have reached the point where my accrued experience does not allow me to dutifully perform as a senior developer anymore. The Big Picture beckons, biz dev / solution design / architecture all overlap in my responsibilities, and while work remains interesting, it seems that somehow my increased ability narrows my options. I've considered the PM route, but the horror stories hold me back. How do you find this "knitting together" working out in the long term?

I'm with you on the "the damn ground is moving too fast" sentiment. My back-end of choice is Erlang-based, because in relative terms it's grounded in immovable bedrock, they fixed all the bugs decades ago and Things Are Not Changing. All my FE libraries are deliberately chosen versions and there is no ever-churning build chain to trigger a cascade of breaking updates. While this approach has worked for me well so far, I can tell not that many others appreciate it. How do you teach the value of reliable constancy to people enamoured with relentless change and tools for tools' sake?


> How do you teach the value of reliable constancy to people enamoured with relentless change and tools for tools' sake?

Can you start a business and hire people to think like you?


At our age (50) I think it's time to trim the fat where you can. Drop all the bloated software and hardware that you don't need. You may not have control over your workplace but you can simplify your own digital life.

I have recently been dropping software, subscriptions, containers and hardware like a good one. I have realised a lot of this stuff is just a giant waste of time. And time is more precious than ever now.


You’re not alone and you’re not too old. I’m 31 and I’m getting pretty tired of things changing all of the time.

So the other day I set up an instance of phpBB for my own private use. A forum for just one person? Yes! I’m experimenting with using a completely vanilla install of phpBB without plugins or anything (I did change the theme to one I liked better than the default, though) to organize my own notes because:

1. The bulletin board model of different forum categories for different topics makes things organizable without going overboard with the organizing.

2. The bump mechanism makes keeping track of current concerns easy by posting in the respective threads. Old stuff of little interest naturally fades into the background without any manual effort.

3. It is searchable.

4. It’s open source and self-hosted, and it’s been battle tested for years. It’s far from perfect, but it’s stable and I could probably run it forever.

5. I’m making it available over my VPN only. I can use it in the browser of any of my own devices as these are all part of my personal WireGuard VPN. Meanwhile, because it’s not reachable from the wider net my install of phpBB is not gonna get trivially pwned.
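The mechanics behind that last point are simple: the service only listens on the VPN interface's address, so nothing outside the WireGuard network can even open a connection. phpBB itself sits behind a web server, so in practice this would live in an nginx/Apache listen directive; here is a minimal Python sketch of the principle only, with 10.8.0.1 as a hypothetical wg0 address:

```python
# Minimal sketch: bind a server to the WireGuard interface address instead
# of 0.0.0.0, so it is unreachable from the wider internet. 10.8.0.1 is a
# hypothetical wg0 address; for phpBB the same principle lives in the web
# server's listen directive rather than in application code.
from http.server import HTTPServer, SimpleHTTPRequestHandler

WG_ADDRESS = "10.8.0.1"  # assumption: this host's address inside the VPN

server = HTTPServer((WG_ADDRESS, 8080), SimpleHTTPRequestHandler)
print(f"Serving VPN peers only at http://{WG_ADDRESS}:8080")
server.serve_forever()
```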


To quote a spiritual/philosophic answer from an Islamic perspective, there's a hadith where the Prophet (peace and blessings be upon him) said: “The Hour will not begin until time passes quickly, so a year will be like a month, and a month will be like a week, and a week will be like a day, and a day will be like an hour, and an hour will be like the burning of a braid of palm leaves"


Agreed. We now live in an environment of "just throw another feature at it". It doesn't matter that the feature is not really wanted, that it screws up the system, or that nothing is properly documented - just throw more features at it.

The result - systems that used to be workable have become increasingly cumbersome and impractical. Don't add features, don't try to do more things - just do one thing and do it well.

And documentation!! I know that good doco is difficult (very difficult), but good doco is now completely unknown. Application doco now consists of a mindless, detailed description of how to select a specific menu entry or enter data into named fields, with no explanation of the effects of those fields.

System doco is just as bad - almost exclusively automatically generated from function definitions. Any programmer can read the definition and decode the type, so it's pointless to duplicate that - just tell us the subtle details that are not clear from the defs or enumerated types.


I'm 57, and have been getting paid to code since 17. The pace of apparent change is high, but actual material change is practically zero. It's just chair swapping to do the same tasks, perhaps with tracking and graphs of that tracking added. My solution is to ignore all of it, unless a change materially affects my goals.


Often, once an industry becomes brain-dead, change is made simply to have something to peddle. MS Office has been like this for decades: they simply change the UI for appearance's sake. Most changes are nothing-burgers or dubious at best, and no new value results. That is often when it becomes time to look for substitutes that actually deliver the value you need.

If you don't see the value, as the customer, you are probably 100% correct in your assessment - everyone else simply is too afraid to mention "the emperor has no clothes" or isn't aware enough of what the big-picture goals and processes need to be to see that the tool is becoming Epic Fail. In general, most dot com 1.0 and 2.0 business models NEVER had the scaling to support sustained growth for more than a decade or two. MOST SHOULD GO OUT OF BUSINESS JUST ABOUT NOW.


I'm not against it, but I feel this way about crypto and Web3.

I feel everything is changing too fast and I'm struggling to catch up. I support it and agree with it; it's just so weird to see billboards with NFTs (which I do consider scams) and Ethereum projects spreading so fast.


> And: we do a bunch of work with Wordpress. The rate of change here is insane, too - every single update brings new features, none of which is documented, bedded in or understood. None of which can be written about, supported or workshopped.

> And: Trello. It was fine. And then Atlassian bought it and it became this horrific behemoth of "features", all of which just clutter everything up, none of which seems to actually do anything useful.

Everything has to show hyper-growth all the time. A mature product or software package that works is actually bad because it's not showing a rocket ship growth trajectory. So anything mature has to be fiddled with endlessly in an attempt to squeeze more growth out of it.


Maybe software devs are like peacocks, flaunting their colorful feathers to catch a mate (customer). And now the peacocks have found a way to change their pattern every week.

Or maybe this is hell and we are being punished for sins in our past lives.


It's even getting into my toothbrush. I started using a smart toothbrush with an app to guide me, and one night the whole pattern changed.

I don't want to re-learn how to brush my teeth right when I'm about to go to bed!


Is this satire? Even as a millennial, I cannot fathom buying a smart toothbrush that presumably relies on bluetooth--and for what, a timer? I'm hoping this is a joke :-).


> Is this satire?

No. It's the fancy Sonicare. I'm the kind of person whose brain is shut off when I'm going to bed, and given that I have dental issues, something that coaches me in good brushing is well worth it.

The toothbrush is great. The app needs a lot of improvement. A good friend who is a dentist just uses the toothbrush without the app.

FWIW: For years I used the $10 Sonicare that uses AA batteries. Turns out the motor is different and the vibrations are different. (My periodontist can point me to actual academic studies comparing the two toothbrushes.) I can't wait for the patents to run out because a toothbrush like this shouldn't be ultra expensive and rely on an app written by amateurs.


I think developers miss the underlying message of "move fast and break things", and take it a bit too literally.


Is it just the top-down approach, basically? Senior leadership has a vision; directors come up with a strategy and have their direct reports execute it; POs and PMs, trying to do their jobs, take the orders from above and come up with features and requirements; designers design it; BAs write user stories for the backlog; developers develop it; KPIs and OKRs are met, bonuses and promotions stay on track all around... and the product becomes bloated while users suffer the breakneck speed of change under the guise of “innovation”?

Maybe I’m jaded from being in tech too long?


I’ve found that my desire to needlessly change my website comes from being bored and not having an outlet for my creativity so I invent new problems to solve.

My most effective way to combat this is coming up with creative dev projects to fill the void. A bouquet website, a website about local parks, a website that lets you browse campsites by following the trails like a MUD.

I used to feel guilty for not working on my main business 100%, but I have both saved my users from being annoyed and picked up knowledge I can use to improve my main site when the need arises.


I think there's a threshold to how much change one can witness and handle.

At first you're excited about change because the status quo seems clunky to you. You thrive in change as long as you can, and then one day you start questioning whether certain changes are necessary. At some point you find yourself rejecting change; holding on to your favorites seems like the sane approach. But what you're holding onto is now clunky and outdated to the new guy, and they are really excited to change things. And so the cycle continues.


That seems reasonable, but combine that change threshold/limit with CI/CD and you've created a burnout factory it seems.


I'm in my early 40s and sometimes feel like I'm in my late 60s, fumbling through idiotic interfaces and not making any sense of it all. (No offense to anyone in their late 60s; based on my cognitive decline, I'm approximating where I'd be in my late 60s, if I make it that far.) The intuition I have accrued over decades of computing is becoming not only useless but a hindrance. I know what things should have been, and the frustration of re-learning the interface du jour slows me down tremendously.


I'm an older software dev (early 40s). Lately I feel that, as an industry, because of (insert noun)-as-a-service businesses, quality can't be directly measured. Because everything is in a web browser, quality for a product manager isn't measured by ROR or revenue but by the number of features delivered. People are inherently selfish and prideful, and since everyone wants to leave their mark, we get products like Teams, Excel, and Google Analytics that have changed beyond recognition.


The rate of change would be fine in my book.

The problem is the rate of breakage. Backwards compatibility is a forgotten art. Code is endlessly rewritten and refactored, sometimes for trivial things like renaming. And of course, they don't keep the old name around for more than a year, if at all.

Can't anything ever be good enough? Obviously things move on and new tech replaces old, but we don't need to break compatibility between versions of the same code for no reason.
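The boring fix being asked for here is cheap: when renaming, keep the old name as a deprecated alias for a release or two instead of deleting it outright. A minimal Python sketch of that convention, with all names invented for illustration:

```python
import warnings

DATABASE = ["alpha record", "beta record"]  # stand-in data for the sketch

def fetch_records(query):
    """The new, preferred name."""
    return [row for row in DATABASE if query in row]

def get_records(query):
    """Deprecated alias: the old name keeps working for a release or two,
    with a warning, instead of vanishing overnight."""
    warnings.warn("get_records() is deprecated; use fetch_records()",
                  DeprecationWarning, stacklevel=2)
    return fetch_records(query)

print(get_records("alpha"))  # old callers keep working: ['alpha record']
```

Callers get a migration window and a pointer to the new name, which costs the maintainer a few lines rather than costing every downstream user a breakage.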


I'm a bit younger but have noticed the same thing, even on a 10-year horizon. I think a lot of it, especially with SaaS, comes down to a product manager's inability to say "we don't need to build more shit" in the current tech startup environment. I don't think it's egotism on the PM's part; it's like the whole system demands we keep shipping and "improving" things and never reach an equilibrium.


I think this is mostly related to feature-creep and the feeling that developers should do something, even though the product might be "done" (considering that software never is quite finished, reading done as "problem solved"). I'm working on a web analytics tool that helps "normal" folks to understand what's going on [0], and users are already starting to demand more than the tool was supposed to be. It's really tempting to keep changing stuff for the sake of feeling productive, instead of improving what's there and making it as reliable and stable as possible. GA is meant to fit all, but it fails on simplicity (alongside privacy and other Google typical problems).

This doesn't mean innovation should stop, but I think a lot of software doesn't need to change this rapidly to stay relevant. Docker is another example that struggles to find a monetization model because they basically solved the problem with the first version, and added tons of irrelevant features and services around it.

I'm 28 btw, having the same feeling.

[0] https://pirsch.io


Attempt at the “why is everything changing too fast?”: when it is so easy to add new features and fix problems later based on feedback (Early Access games on Steam, for example), why not? We’ve conditioned ourselves to consuming the flood of novelty, so my guess is that this inclination corrupts our making as well. Why bother editing my book when the publisher wants 300 pages, even though I can tell the story in 100?

Gripe: the Discord app feels too invasive when all I want to do is talk with my friends and play a game together, but it’s what works for us, and with so many people using it, that’s a lot of valuable data to gather in the interest of making the service even better, ideally staying just above the annoyance threshold of each subgroup enough to keep users hooked. The audio’s great, so I just use it in a webapp, thus also avoiding having to open a browser to update Discord on Linux when the auto-updater can’t do it.

Flood of novelty: I still get the little vestiges of a thrill when I update software. If nothing looks different, were there any changes made? Is this part of why we’re seeing so much change, along with trying to please everyone?


There is a theory I believe in: it's most productive to have tools that do evolve significantly over time, but not continuously. You need months or years of stability in a tool to a) learn its ins and outs and b) get value by applying those learnings for a significant period. If the tool is constantly changing, even if it's getting better, you never have time to leverage your new skills in it.


I think this is a major outcome of leasing everything you use. Everything used in the process of developing software can be SaaS based - work tracking, source code repository, deployment tools, code analysis tools, the surrounding tools if you deploy to the cloud, even code editors these days are pushing to be SaaS based.

This is akin to a software-based dashboard in a car with no buttons, dials or knobs, just a touch screen. You potentially have to re-acclimate yourself to how to use something much more often.

It would be good to understand what is driving this, but maybe it's really just an era of change for the sake of change (aka Resume Driven Development, for product managers too). I often have to rein in developers who just want to re-write things - it's fun to do a wholesale re-write, but it's very rarely necessary and offers little actual business benefit.

I wonder if this means there are opportunities for startups to build installable software and/or build products in a way that "locks" a UI/UX for a period of time as a benefit - it would make their product stand out.


I feel the same way and am not even 40. It's too much swirl for BS "features". It's like there's a rush to deliver deliver deliver without actually looking at client impact and value.

On a related note, I'm tired of filling "stretch" roles (more than one role at a time) and being required to work in multiple stacks. Just let me be full stack in only one stack, please.


Another thread points out that the contents of a Halo disc being sold today aren't even a functioning game.

Back in the day when you shipped a game, you sure as %^*# made sure it worked, because you couldn't patch it easily, or at all.

I think this contributes to it. Nothing is ever complete nor needs to be designed to be complete because the cost of modifying shipped software is lower than ever.


According to Peter Thiel, the rate of change isn't really much different than before. What IS different is that the world of atoms (i.e., outside the internet) is getting more and more regulated, so all innovation and change gets funneled into the internet, which still remains largely unregulated. I think he's correct on that one.


> ...Is this rate of change supportable?

This is hard to project, as there is no definitive target; most of the changes that you describe are more operational in nature. Basically, it's an equivalent of planned obsolescence.

Most people can still operate hand screwdrivers and get the job done, yet their preferred tool may now be powered.

There are benefits, of course, and most don't bother noticing these transitions, as they may be occupied with a higher level of problem solving.

Another factor is that in software there's no real "ideal" form. The same material ideas can be shuffled ad infinitum, as in a kaleidoscope, creating different feature combinations and looks. Perhaps you've gotten to see quite a few and no longer feel excited by the rolls.

But even for younger folks there have been many transitions in a short time, like the fast pace of smartphone changes in less than 10 years. Maybe younger minds are more adaptable to quick changes.


There are changes that occur because your customers want new or different features. Just because some customers don't want or need them doesn't mean that most customers don't, or at least the ones who pay the most money.

Also, the default fate for most tech companies is to become obsolete, so things don't always get changed "for the sake of it": if you don't change, before you know it your competitors all look much fresher than you do and you have 5 years of work to catch up on.

Another issue is direct competition. Your customers are less likely to be loyal, because SaaS provides quick and easy onboarding to another solution if the current one doesn't cut it. We have plenty of customers complaining that we don't have feature X and saying they will leave if we don't implement it. So we do!


The people who buy the tools only consider cost and feature lists (or maybe Google Analytics is really useful for them personally, everyone else be damned). The people who make the tools only consider their dev partners and the people buying the tools.

Users, like you and me, are not considered anywhere along the way. As a result, all of our job descriptions now include managing this ever-changing infrastructure of red tape on top of everything else we've got to do. A little bit of this is OK and expected, but it sounds like you're dealing with more than a little bit of it all at once, to the point where it's not sustainable.

If we're in that position, it's up to us to have the confidence to say to our manager, "OK, I can do this, but it's going to take (this much) longer because I have to deal with this other stuff now, too."


For me, an issue is that the most popular frameworks are created by huge companies that rely primarily on ads and tracking data, so everything requires massive, complex server-side resources, which are too expensive for someone like me who just wants to make free and open source web apps. I was hopeful about the 'unhosted' movement 10 years ago, but it kind of fizzled, as seemingly have related projects like PouchDB and Solid. I'm hopeful about recent posts on using SQLite in the browser and W3's Storage Foundation, but they are still in development and not really ready for prime time anytime soon. PWAs and the like seem to have fallen out of favor, too.

Browser vendors (controlled by the same huge companies) blocking WebSQL and poorly supporting things like PWAs are other examples of this issue.


Fundamentally, the reason for why everything is changing too fast is that people have more capacity to create change than to absorb it.

Specifically, some people have that capacity and also do not care about whether other people can keep up absorbing it.

Another fundamental reason is that creating change is profitable. Corporations are penalised for making stable environments by competition that creates new things, faster.

The next level from that is corporations focusing on rate of change (that's what Agile is for, folks). Now corporations compete less on their product and more on how fast they can react to whatever the competition is doing.

You see where all this leads. We users are all just collateral in the struggle.

I would prefer a more stable operating system to a couple of refreshes of the Windows or macOS UI. But that's not how Microsoft or Apple think.


When asked to improve a system, people tend to add to it rather than remove from it. There is a research article explaining this beautifully, but I lost track of it and couldn't find it. (Someone please point me to it if you know what I am talking about.)

So, with these systems/products and the constant need to improve and upgrade, we are endlessly adding more things. There are multiple motivations here: by adding more features they can charge more money, or they are looking to integrate with existing products or cover new markets. In the end, the users bear the cognitive load.

I don't think it's the age or the person. It's the tools, which are becoming overly feature-rich or, I would say, feature-greedy.


I updated to ios15 on my iphone and now the safari address bar is on the bottom. No big deal, but why? Someone at apple had to justify their job? Please, if you feel the need to pointlessly move UI components, consider contributing to an open source project instead.


The world can be surprisingly misaligned with logic or efficiency. Feature bloat and UI reshuffling are bad things: they make the product more complicated for its users and distract from the core product-market fit that made the product successful in the first place. But the job of software teams is to ship, and once the product is in a good place, the shipping doesn't stop. It would take a strong leader to say, "enough is enough, stop bloody shipping!" I think leadership with the quality and confidence to do that is rare; more likely, leaders live in fear that someone else is going to unseat them (another product, another person) and so, feeling like the bear is always chasing them, they never stop running.


I'm also very tired of the increasing rate of change. I know a lot of people here will be familiar with the Red Queen Hypothesis [0]; although the term initially comes from evolutionary biology, I think it's applicable in any situation where there is a lot of competition. And then most if not all the tech companies, which are aware of this effect even if they don't give it a name, are constantly trying to "innovate" and find the New Hot Thing before the competition does. And this makes said companies try to go faster than they would go if they weren't so afraid.

I don't know if it's just me, but I'm seeing a lot of urge to change everything even in the more conservative big enterprises, which used to be a safe space of sorts for people who wanted to build products on stable technologies: maybe not so hot, but still really solid and well-functioning. I can understand if enterprise desktop applications are seen as obsolete and everything has to be web now (to me it still sucks, though, both as a developer and as a user; I assume I'm obsolete as well). But for a few years now, everything has had to be Everything-as-a-Service, and machine learning must be shoehorned in some way, whether or not it's well suited to the problem at hand. It's gotten so bad that higher management starts demanding rewrites of 100K-1MLOC applications in new languages, with every component of the full stack being new, because trusted technologies are apparently not evolving fast enough.

As I've said above, I assume that I'm just obsolete. But I'm pissed off at the current state of the industry, not only because I have to switch jobs for my mental stability, but also because I used to be at least partially in sync with many aspects of the industry, and that stopped in the last 5 years or so. I don't see myself being a developer for much longer... it's too much bullshit, and I have non-software skills that I can put to good use in a job that doesn't make me insane, even if the pay is lower.

[0] https://en.wikipedia.org/wiki/Red_Queen_hypothesis


> It's gotten so bad that higher management starts demanding rewrites of 100K-1MLOC applications in new languages, with every component of the full stack being new, because trusted technologies are apparently not evolving fast enough.

At my boring enterprise company, management sort of insists on the newest technologies because they see it as a recruiting tool. We're having a lot of trouble hiring people, and candidates ask about these shiny new toys (k8s, cloud, etc.) during interviews, so management just wants to give it to them.


There need to be more co-op open source shops rather than corporate behemoths that don't care about UX, consistency, or users.

Churn is pointless change. Changes should improve things, not provide job security, offer design novelty, or waste time or money.


At 55 I can relate. I'm wondering whether, in the tech space, we happen to be the first generation that has spent our whole professional careers in tech (for me: electronics first, then computers, then networked systems, then internet systems, with a bit of dev sprinkled throughout), and whether after 40 years of constant change you (or one) just get to a certain stage. I know I'm done (at least professionally; I've retired), although I still tinker with tech daily out of pure interest. In my last professional gig I spent a lot of time wondering why I was spending so much time re-tooling, re-fixing, and re-inventing the wheel. :-)


A pernicious product management pattern is data-driven feature testing - every single possibility is A/B tested to death. Even if they get the cohorting done correctly so that you get a "stable" experience for that feature, the bar for changing the UX is much lower than it has ever been.

There's no direct metric capturing UX stability, and I've never seen a real attempt at measuring what rate of UX change is sustainable. Bundled together with the PM career pattern of "add one big feature that has a measurable impact, then get a new job," we end up with an unstable stew where nothing works like it did last week.
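For what it's worth, the "stable cohorting" part is the easy bit; the instability comes from how often the experiments themselves change. A minimal sketch of deterministic bucket assignment (all names invented for illustration): hashing the user ID together with the experiment name gives each user the same variant on every visit, which is what makes the per-feature experience stable.

```python
import hashlib

def assign_bucket(user_id, experiment, buckets=("control", "variant")):
    """Deterministically map a user to an experiment bucket.

    Hashing user_id together with the experiment name means the same user
    sees the same variant on every visit, while different experiments
    still split the population independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

print(assign_bucket("user-42", "new_checkout_flow"))  # same answer every run
```

Note that nothing in this mechanism limits how many experiments run at once or how often they ship, which is exactly the gap the parent comment describes.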


the "unbundling of excel" to SAAS has exposed us to profit motive moral hazard, it's the expansion phase we're in as VCs rollout www tech across all of knowledge work.

(PS: this is the bull case for a good low-code tool. SaaS cannot be trusted and is going to exploit us until programming gets simple enough to bundle all this back into a networked-graph spreadsheet for precision data MVPs, so we can all just get our work done again. I'm a founder in this space: http://www.hyperfiddle.net)


The answer to your actual question, "why is everything changing too fast?", is: what did you think would stop it? There are advantages to moving quicker than your competitors/peers (some substantive, some merely in appearance), and therefore there is a "Red Queen" effect making things go faster. What is supposed to slow it down? In the current system (either economic or technological), there is nothing intended to keep the pace of change from getting too fast. If there's nothing to prevent it, it will happen.


This makes me think about the value of Craigslist, namely its crazy-stable user interface.

Maybe products like Teamwork could actually "SELL" the "feature" of not changing things. Like a promise the UI won't update for 5 years. Or that you'll always be able to use the "classic" interface and only progressively enhance your environment as you choose.

I feel like Wordpress does a pretty good job with this. We don't usually have big changes forced on us, and some of our sites are running really old versions because people like it that way.


I disabled auto-update on my iPad because forced updates destroyed a couple dozen apps I used to use (e.g. an app removed because it was abandoned and didn't support some new API that Apple deems a must, or an app that went full-on SaaS and just doesn't work unless I subscribe to a monthly fee, even if I was using it for a single purpose).

On the scale of human civilization, software is a newborn. Will we have this level of change in 100 years, 1,000? Doubtful. We are just waiting for the equilibrium to occur, and until then, enjoy the chaos!


There's an excess of funding for software development. Companies have to find ways to deploy the massive engineering/product budgets that they built up in the initial development stages.

Very few companies want to lay off engineers and product managers (understandably so - talent is hard enough to recruit and retain). So these departments just keep producing more and more features and updates.

Truth be told, the software industry is a lot more like the old-school corporate bureaucracies of the past than anyone cares to admit.


I think this is partly due to the "feature saturation" that happened in the '90s and early 2000s. As a knee-jerk reaction to that, software became "lean" and very "do one thing and do it well" (the Basecamp era).

IMO we're now going back to feature saturation. Notion, for me, is the perfect example: a great piece of generalist software that doesn't do any one thing particularly well. The new GitHub Issues too, and the new Trello features. Jeez, even Basecamp nowadays.


Similar age, but I'm an academic economist at a research university. I've been dealing with change my entire career. I've had to teach myself Bayesian econometrics, machine learning, loads of computational tools, and on and on. It never ends. I haven't used anything I learned in grad school in an important way in a decade.

For me, the problem is that I don't have the time to learn that I used to. My job has so many other aspects that I no longer have the six-hour blocks to learn new stuff.


I started seeing this trend a few years back. Whenever I start to use a tool or make a choice for the team, the questions I always ask are:

- “Can I walk out of this with my content/data and move elsewhere?”

- “Do I really need the data/content when I want to move out?”

And I try to stay closer to the ones I can move out of fast, without heartbreak. There used to be a good gesture in the industry for OGs: you were "grandfathered" in and could keep using the same thing for many, many years to come. These days, they don't even care.


I'm personally only spending time on the peer-to-peer/distributed web space: DAT/Hypercore, SSB, Activity/Commons-Pub, Holochain, GUN, Solid, etc.

Why did a standard like RSS get expelled from client-server web platforms? It makes little to no sense. The knowledge pyramids that are growing every day are anti-learning, so we need to continue to push for a distributed p2p web.

The growing p2p space is an inspiring counterforce to the dominant venture capital client-server web.


My favorite take is this "your change probably isn't for the better" article from 2 years ago:

https://gist.github.com/sleepyfox/a4d311ffcdc4fd908ec97d1c24...

Google underestimates the cost of change to consumers, has a legitimate need to add features to its products, and has no internal management ability to globally rate-limit product changes.


The software development process has improved a lot over the years. It's no longer that hard to ship a production-ready application or feature in weeks rather than months or years. Product development teams are typically dedicated to one product at a time, and their performance (and to some extent their employment) is measured by how much new software is shipped every sprint, so they have to come up with something every two weeks.


I’m 29, have cofounded a startup, and I feel this way. The way I see it, product managers are the first issue: every one of them wants to make a name for themselves. Second, managerial leadership is so unacquainted with reality that they can’t see through the marketing, the short-circuited mental models and the simplifications anymore, and therefore can’t make reasonable decisions.


All of what you mention is why I moved away from cloud software several years ago and built my own versions of the tools I need for my business. They only have 10% of the features, but that 10% is highly customized to my specific situation. It never changes, and because I wrote the code I can change it when I need to, and I understand what I changed and why.


If you have an hour, go watch this: https://www.youtube.com/watch?v=FCWAvcsnIFg. I think you can have change and new features in a more reasonable way. Btw, my opinion is you are not crazy; there is a lot of change for change's sake.


It's "designers" who have no idea what they're doing or asking for and they get power because usually they're the most annoying people in the company, so people just let them do it to shut them up. It's almost a generally applied concept, add a "designer" and suddenly everything is trash.


Way more money to be made -> more developers get hired -> more features get shipped.

Also, faster dev machines & better tooling make iteration faster.

There's also a cultural shift from shipping mega-features to shipping small incremental changes, which is good for developers (fewer merge conflicts) and QA (incremental, limited testing).


To answer to OP, I think it's an inevitable consequence of the economic model behind the companies that make those products. That's why I'm trying to adopt FLOSS tools as much as possible (and also to try to learn how to contribute and to give back with both time and money).


My view is, a lot of change is change for the sake of change.

When you start, you learn what is there and it's all new and exciting. While at 50 you must also unlearn some of what you know, for seemingly little gain. That gets boring.

Do you remember back in the late 1990s, when you were just starting?

In 1998, Cusumano and Yoffie coined the term "Internet Time" to describe Netscape. Here are some quotes from http://edition.cnn.com/books/beginnings/9811/internet.time/ :

> The conventional wisdom about competition in the age of the Internet is that the business world has become incredibly fast and unpredictable, and we need to throw out the old rules of the game. ... After more than a year of intensive investigation, we are inclined to agree with some (but not all) of the hype. ...

> For us, competing on Internet time is about moving rapidly to new products and markets; becoming flexible in strategy, structure, and operations; and exploiting all points of leverage for competitive advantage. The Internet demands that firms identify emerging opportunities quickly and move with great speed to take advantage of them. Equally important, managers must be flexible enough to change direction, change their organization, and change their day-to-day operations. Finally, in an information world where too many competitive advantages can be fleeting and new entrants can easily challenge incumbents, companies must find sources of leverage that can endure, either by locking in customers or exploiting opponents' weaknesses in such a way that they cannot respond. In short, competing on Internet time requires quick movement, flexibility, and leverage vis-a-vis your competitors, an approach to competition that we define later in this chapter as "judo strategy."

Sound familiar?

This of course led to a lot of use of the phrase in pop culture, and to counter-arguments, like Denning's essay at https://dl.acm.org/doi/fullHtml/10.1145/504729.504742?casa_t...

> One of the most common buzzwords today is "Internet time." It describes the apparent increase of the pace of important events that we experience with the Internet. Developments that used to take years, it seems, now happen in days. Competitors pop up by surprise from nowhere; it is no longer possible to identify them all and monitor them. The now-widespread practice of email has simultaneously improved business communications and become a burden for many. Many IT practitioners, growing weary of spending two or three hours a day keeping up with the many dozen arriving email messages, complain of "information overload." Like most buzzwords, "Internet time" and "information overload" contain important seeds of truth while masking misconceptions that lead to ineffective actions.

> Andrew Odlyzko debunks a key aspect of Internet time - the notion that the Internet has sped up the pace of production and adoption of new technologies [3]. He offers example after example of new technologies that have taken just as long to diffuse as their predecessors in previous decades. He concludes that the most cited example, the Web browser, is the single exception to the rule. He claims that belief in the myth comes from a misreading of transient phenomena and from business hype.

Time to rewatch Koyaanisqatsi: Life Out of Balance. :)


I agree. A huge problem is that the incentives get misaligned. The bottom line isn't aligned with what's best for the customer.

I don't think a single feature on MacOS in the past 10 years has really impacted my day to day usage of the computer. And I don't think I'm alone in this.

I asked a group of friends "what is your favorite feature on MacOS in the past 10 years?" Most couldn't answer this question. One brought up "the retina display". (If you have an answer to this, please post as a reply.)

So a decade later, my computer hardware is much faster, yet the OS is pretty much the same, but slower. And a bit prettier. And it takes up 12 GB now.

I want performance and stability. I want apps to load fast and the OS to get out of my way. I want good battery life. But if Apple focused on making the OS faster and more stable, I suppose it wouldn't sell more computers.

I sometimes wonder what our end-user experience would be like if all the macOS team focused on for 10 years was performance, stability, and polishing the UX.


Quantum changes: abrupt changes in levels instead of a smooth rate of change.

Another example is an earthquake. Stresses build up slowly and nothing happens for a long time. Then all of a sudden things snap into a new position, and the whole tension cycle starts again.


I see that when using MSFT Teams: the loops of calls to the MSFT account endpoint and whatnot, the inability to allow multiple accounts/domains. Even when the software has components that are not bloatware, the clutter is there on arrival.


I've started building simple versions of google tools like Contacts on my own server. They load fast, are non-dynamic, and it's satisfying as hell :) It feels like a hedge against the way I see a lot of the megalith software going.


I think this could be a good argument for more open source software. Developers are volunteers, or have limited time away from their jobs to make changes, so they tend not to break things that work.

For profit companies and employees have different incentives.


This aspect has kept me in business making bespoke web systems for clients, with the underlying promise that things will, wherever possible, stay the same. It has been the deciding factor for many of the teams I worked with.


Key driver — need for profits.

Constantly borrowing from the future is something people got used to.

Secondary driver — need for reinventing the wheel with each new generation (people want to be proud of their own achievements, not bowing to the fathers).


I don't think anything is changing faster than before. Instead the issues come from relying on more products than ever and those products being hosted externally which means there is no way to avoid the changes.


Because we can.

(And don't (want to) think about the consequences.)

(("We" as in humanity. This trend is not only in software; everything down to your pencil is pushed into a new version, for the sake of it. Over the air, outside your control, etc.))


Limit dependence on proprietary “tools.”

The basics haven’t changed much, and can do most things.


The commercial tools are changing out of necessity. If they stop changing they die. I won't go deeper into this.

The open source power tools are rock solid for years and years. I have a 100% CLI-based setup that hasn't changed much in 10 years: mutt, offlineimap, notmuch, vim with plugins for what you need (wiki, calendar, todo, etc.), git, python, etc. Everything in i3 on Linux.

I think I've spent hundreds of hours refining this setup. I am sure somebody will send me the relevant xkcd strip for this, BUT: I am happy with my setup, I love tinkering with it and optimizing it, and my results at work reflect this. I never have to search for a file or an email during a presentation or anything like that.

Meanwhile, other people have their Windows desktops covered with folders and then some, and sweat a lot fumbling through their workflows - meaning they have failed by their own standards. It's the result that matters, remember that. If you're not happy with the outcome, you must be doing something wrong.


> The commercial tools are changing out of necessity. If they stop changing they die.

Change by itself is neither good nor bad. But change for the sake of change is rarely good.

Evernote is a classic example for me. Sometime around 2014, it was perfect for me. The Windows application was small, fast, and easy to use. The web clipper solved a real problem for me and the mobile applications were good enough. I was on the $35 / year plan (and still am) which felt like a decent value. I loved hearing Phil Libin talk about building a company that will be around in 100 years. It made me believe they would be conservative and value stability and it was exactly what I wanted to hear.

Today I'm working out how to move everything from Evernote to either Apple Notes or Obsidian. Evernote's Windows client got fat and slow and feels terrible to use. It's a bummer for me because I used to love the product.


It is 2021 and you still cannot find an integrated software package to run a business, yet specialty software products keep adding features that make their software less universal.


Lucky you aren't a Javascript developer.

Makes me love the command line.


I think we should consider having the role of a “user advocate” in all product decisions so users can push back against harmful or confusing changes.


But how can a user advocate stand up against the shareholders, the people with the money, the ones that have the companies by the balls and demand the metrics to go up or else heads will roll?

I mean. The founders and employees of those companies are incentivized to go along with it, because numbers go up means they will end up better off as well.


Isn't that the product owner's job?


Product managers should be rewarded on keeping the rate of change low for user interfaces. It does not seem like that is the case currently.


Ugh, I miss when WP was a quiet, unexciting project that mostly got security updates. I dread the day when they finally make me switch to the Gutenberg editor. My site uses themes I built a decade ago that don't know a damn thing about blocks, and I have no interest in teaching them about those, or in blowing a month on redesigning the whole fucking site around the new ways.


Side topic: what's wrong with being a software dev?


IMO, it's that it's not a job but a skill. And that skill is meant for doing a job, not for being a job itself.

Programming ability today is like a modern equivalent of literacy or numeracy, but it's being abstractly used by the illiterate and innumerate to accomplish a job they want done but can't do themselves.

Which is how we end up with products developed by people who don't eat their own dog food. It's not necessarily their fault, it's the nature of the beast.


> Is this rate of change supportable?

It's not, and it's severely hurting all production environments that need stability and security.


The .NET Core release cycle seems to be like this.


.NET Framework was stable for so long; then, as soon as they open-sourced it, everything churned. .NET 6 looks nothing like .NET Core 2-3. Wth...


To be fair, .NET 6 does look a lot like .NET Framework (4), so really it's a return to the normal.


I've actually stopped paying attention as I'm not coding as much these days. That will change soon. Is .net a return to stability?


The monetization models are broken, which is causing otherwise good tools to start trying to evolve into other things.


Sounds like fluid vs. crystallized intelligence; see https://www.betterhelp.com/advice/general/what-is-crystalliz... for more info.


Google Analytics is not intended for you to achieve any meaningful success; it exists to collect data from your properties and sell you ads. The best analytics are those you build yourself, or perhaps some open source package. That stuff won't change overnight.


I used to use Trello a lot... Now I'm sad and without my favorite task manager.


Same. Now I organize my former Trello projects as a series of markdown files and subdirectories in my home directory. I went from being a paid user of a service to just making text files, because of how badly they fucked up that product.


It's just that tech is moving massively fast right now.


From where I sit, things don't change that much. Innovation is lower than it once was, and large structural changes follow the dictates of capitalism alone, which generally favors profits over reinvestment and improvement.

From a narrow viewpoint, perhaps coding is blossoming, its seed spread to the wind to create flowers across a whole field. Or is it a mirror: where once it was a single plane, we have now shattered it into hundreds of tech pieces and distributed them.

Yet, broadly speaking, things need to change faster if we're to mitigate climate change, disinformation, corruption, and all the threats of 21st century life. I will happily rewrite documentation and relearn methods if they're shown to be safer, more efficient, and more expressive.


Software has now digested the world and turned it to...


Why Software has Indigestion - Marc Andreessen


I’m mid-30s and have also noticed this horrible trend.


It's a sign the field is running out of ideas.


"All that is solid melts into air."


Open your arms and hug the changes :)


this is why you use emacs instead


Or vim. Come on now. ;)


Market cap has to grow somehow. Capitalism as an economic model depends on growth. So they have to come up with ridiculous things like the hedonic index and such to create an illusion of added value out of nothing.


Our financial system rewards growth. There is little incentive to keep good tools static. It goes beyond the tug of capital; even GDP increases with such churn, whether the work is useful or not.


Because expecting things not to change is SELFISH.

Life evolves, and we do too: our habits, our needs. We need to adapt to the environment, to weather, to scarcity, to people's moods.

You are selfish if you expect things to just stop; and if you don't adapt, nature tells us, you'll go extinct.


Lack of attention span. It's the Google syndrome.


No, it’s fine. You’re just getting older and tired of the churn. Most changes are good and welcome, imho (30 y/o software dev though, not a PM [thank god]).


I found the source of the problem.



