arkh's comments

> GLP-1s also appear to quell addictive behavior around alcohol and hard drugs

Being on a D2 agonist, which has the exact opposite effect, I wonder what taking both would result in.


The result of Agile (and DDD, TDD etc.) comes back to The Mythical Man Month: you're gonna throw one away. So plan the system for change.

And due to Conway's law: plan the organization for change.

From those ideas you derive Agile (make the organization easily changeable) and the tactical part of DDD (code architecture meant to be refactored often and easily).


> Or is the only solution to fix loading times in the first place?

Ding! Ding! Ding! We got a winner!

Yeah, maybe we could expect machines that have had 40 years of Moore's law to give you an experience at least as snappy as what you got in DOS apps.


Yes, I am baffled by how painfully slow modern apps are. Everything seems to include the Chromium Embedded Framework and therefore has an entire browser running. Sadly, there is a generation of people who grew up after .NET was introduced who think it's perfectly reasonable for a VM to spool up as part of an app load, or for an app to load a whole browser, and who have no idea how speedy Windows 95 used to be, how an app could load in less than a second, or how easy Delphi apps were to create.

It's honestly very sad.


It is really amazing how big GUI flagship apps like the Office or Adobe suites seem slower than they did in 2001. They don't really do anything different from those old tools, maybe a few extra functions (like content-aware fill in Photoshop), but the bread and butter is largely the same. So why are they so slow?

It is almost like they realized users are happy to wait 30-60 seconds for an app to open in 2001 and kept that expectation even as the task remained the same and computers got an order of magnitude more powerful in that time.


Let's not go too far with the rose tinted glasses. Win95 apps are speedy if you run them on modern hardware but at the time they were all dog slow because the average Win95 machine was swapping like crazy.

Loading apps on it definitely did not take one second. The prevalence of splash screens was a testament to that. Practically every app had one whereas today they're rare. Even browsers had multi-second splash screens back then. Microsoft was frequently suspected to be cheating because their apps started so fast you could only see the splash for a second or two, and nobody could work out how they did it. In reality they had written a custom linker that minimized the number of disk seeks required, and everything was disk seek constrained so that made a huge difference.

Delphi apps were easier to create than if you used Visual C++/MFC, but compared to modern tooling it wasn't that good. I say that as someone who grew up with Delphi. Things have got better. In particular they got a lot more reliable. Software back then crashed all the time.


I suppose you are right. I worked with MFC/C++ and COM and it was horrible. Delphi and C++ Builder were nicer to use but fell by the wayside, particularly after Borland lost their focus, didn't bother supporting themes correctly in the VCL, and also had major issues with their C++ compiler. They suffered a brain drain.

I remember Windows Explorer opening swiftly back in the day (fileman even faster - https://github.com/microsoft/winfile now archived sadly), and today's Explorer experience drives me insane with how slow it is. I have even disabled most shell-extension menu items, as evaluating them makes Explorer take even longer to load; I don't see why it can't start in less than a second.

Anyway, I do recall Netscape taking a long time to load, but then I did only have a 486 DX2 66MHz and 64MB of RAM... The disk churning did take a long time, now that you remind me...

I think using wxWidgets on Windows and Mac was quite nice when I did that (with wxFormBuilder); C++ development on Windows using native toolkits is foreign to me today, as it all looks a mess from Microsoft, unless I have misunderstood.

In any case, I can't see why programs are so sluggish these days. I don't understand why colossal JS toolkits are needed for websites, or why the average website size has grown so much. It's like people have forgotten how to write good, speedy software.


Well, today I spent a lot of time waiting for some slow software that I wrote and maintain, a program that helps ship large desktop apps that use JVM or Electron. It can do native apps too but nearly nobody writes those. So I guess I both feel and create your pain in several directions.

Why is my software slow? Partly because the task is inherently intensive. Partly because I use an old Intel MacBook that throttles itself like crazy after too much CPU load is applied. Partly because I'm testing on Windows which has very slow file I/O and so any app that does lots of file I/O suffers. And partly because it's a JVM app which doesn't use any startup time optimizations.

But mostly it's because nobody seems to care. People complain to me about various things, performance isn't one of them. Why don't I fix it anyway? Because my time is super limited and there's always a higher priority. Bug fixes come first, features second, optimizations last. They just don't matter. Also: optimizations that increase the risk of bugs are a bad idea, because people forgive poor performance but not bugs, so even doing an optimization at all can be a bad idea.

Over the years hardware gave us much better performance and we chose to spend all of it on things like features, reducing the bug count, upping complexity (especially visual), simplifying deployment, portability, running software over the network, thin laptops and other nice things that we didn't have on Windows 98. Maybe AI will reduce the cost of software enough that performance becomes more of a priority, but probably not. We'll just spend it all on more features and bug fixes.


> People complain to me about various things, performance isn't one of them.

Which is fine, and you are doing absolutely the correct thing in fixing what's being complained about.

But the complaint I keep hearing (and having myself) is that most apps are quite slow, and have been growing slower over time as updates arrive - on mobile phones in particular.


I think this reflects a shift towards all software depending on a remote database, more than some change in programmer attitudes or bad software in general.

Win 9x era software relied entirely on files for sharing, and there was no real notion of conflict resolution or collaboration beyond that. If you were lucky the program would hold a Windows file lock on a shared drive exported over a LAN using SMB and so anyone else who tried to edit a file whilst you'd gone to lunch would get a locking error. Reference data was updated every couple of years when you bought a new version of the app.

This was unworkable for anything but the simplest and tiniest of apps, hence the continued popularity of mainframe terminals well into this era. And the meaning of "app" was different: it almost always meant productivity app on Win 9x, whereas today it almost always means a frontend to a business service.

Performance of apps over the network can be astoundingly great when some care is taken, but it will never be as snappy as something that's running purely locally, written in C++ and which doesn't care about portability, bug count or feature velocity.

There are ways to make things faster and win back some of that performance, in particular with better DALs on the server side, but we can't go backwards to the Win 9x way of doing things.


DAL == data access layer

My benchmark is IrfanView. I think I started using it on XP, and you got to enjoy the speed from the install (3 clicks, and where you'd expect a loading bar you get "launch or close wizard").

> Let's not go too far with the rose tinted glasses. Win95 apps are speedy if you run them on modern hardware but at the time they were all dog slow because the average Win95 machine was swapping like crazy.

I disagree that those apps were dog slow at the time. They were fairly responsive, in my experience. It's true that loading took longer (thanks to not having SSDs yet), but once the app was up it was fast. Today many apps are slow because software companies don't care about the user experience, they just put out slop that is barely good enough to buy.


> Yes I am baffled how modern apps are painfully slow.

People underestimate how slow the network is, and put a network between the app and its logic, making the app itself a thin HTTP client and the "application" a mess of networked servers in the cloud.

The network is your enemy, but people treat it like reading and writing to a local disk, because it happens to be fast at their desk when they test.


I think all developers should test their software against a Raspberry Pi 3 (100 Mbps network link), just to force them to make it smaller and faster. That would eradicate the colossal JS libraries that have become the modern equivalent of shipping dozens of DLLs.

Modern iterations of the Pi have gigabit Ethernet, but your suggestion has an even bigger hole given the context.

Latency.

The issue the parent mentions is one of latency: if you're in the EU and your application server is in us-east-1, then you're likely staring at an RTT of 200ms.

The Pi under your desk in NY? 30ms, even less if it's a local test harness running in Docker on your laptop.
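To put numbers on it, here is a quick back-of-the-envelope sketch. The RTT figures are the illustrative ones from this thread, not measurements, and the call count is made up:

```python
def total_wait(round_trips: int, rtt_ms: float) -> float:
    """Seconds spent waiting on sequential request/response round trips."""
    return round_trips * rtt_ms / 1000

# A screen that issues 50 sequential API calls:
for label, rtt in [("EU -> us-east-1", 200), ("same city", 30), ("localhost", 1)]:
    print(f"{label}: {total_wait(50, rtt):.2f}s")
# EU -> us-east-1: 10.00s
# same city: 1.50s
# localhost: 0.05s
```

Bandwidth barely matters here; it's the fifty sequential waits that add up, which is exactly why the app feels fine on the developer's laptop and terrible across an ocean.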


When I want to make old devs cry I send them this link[1]:

https://forum.dlang.org/

I know it's very simple, I know there isn't a lot of media (and definitely no tracking or ads), but it shows what could be possible on the internet. It's just that nobody cares.

[1] Yes, Hacker News is also quite good in terms of loading speed.


Private torrent trackers are generally fast too. My pet peeve is when news websites are slow, because the only content they have is 90% text.

That remaining 10% of adware is heavy stuff.

> you don’t get to be a senior engineer without being the kind of junior engineer that the LLMs are replacing

I disagree: LLMs are not replacing the kind of junior engineers who become senior ones. They replace the "copy from StackOverflow until I get something mostly working" coders - those who end up climbing the management ladder, not the engineering one. LLMs are not (at the moment) replacing the junior engineers who use tools to get an idea and then read the documentation.


The best engineers and craftsmen - even juniors - understand which work is core to the craft and the end result, and which is peripheral.

Unless you're hyper-specialised within a large organisation, you can't bring the same degree of obsession to every part in the process, there will always be edges.

Even an artisan who hand-builds everything that matters may take some shortcuts in where they get their tools from, or the products they use to maintain them.

In a big org, you might have a specialist for every domain, but on small teams you don't.

And ultimately I've got other things to do with my life besides learning to write CMake from scratch.


I'm far from an AI enthusiast but concerning:

> There is a good chance that there will be a generational skill atrophy in the future, as less people will be inclined to develop the experience required to use AI as a helper, but not depend on it.

I don't know how to care for livestock, or how to prepare and can a pig or a cow. I could learn it. But I'll keep taking the path of least resistance and get it from my butcher. Or to be more technological: I don't know how to write a bare OS capable of booting on a motherboard, but that doesn't prevent me from deploying k8s clusters and coding apps to run on it.


> I don't know how to care for livestock, or how to prepare and can a pig or a cow. I could learn it. But I'll keep taking the path of least resistance and get it from my butcher

You'd sing a different tune if there were a good chance of being poisoned by your butcher.

The two examples you chose are obvious choices because the dependencies you have are reliable: you trust their output and methodologies. Now think about current LLM-based agents running your bank account, deciding on loans,...


Sure, but we will still need future generations to want to learn how to butcher, and then actually follow through on becoming butchers. I guess the implied fear is that people who lack fundamentals and are reliant on AI become subordinate to the machine's whims, rather than the other way around.

Maybe it's not so much that it prevents anything; rather, it trends toward a future where all we get is a jpeg of a jpeg of a jpeg. I.e. everything will be an Electron app, or some other generational derivative not yet envisioned, many steps removed from competent engineering.

If your butcher felt the same way you did, he wouldn't exist

After reading this post I'm left wondering: you want to capture events. You want to have different views of them. Why don't you use Kafka and create a consumer per "view"?

That's a good question.

First of all, Kafka is still an event streaming platform and lacks database capabilities such as indexing and query optimization. Although ksqlDB/Kafka Streams can perform computations on the consumed data, they require repeatedly pulling data, and there is no technology like indexing to accelerate queries.

Secondly, dashboards and alerts in monitoring scenarios require a large number of views—these are the “known unknowns”. When dealing with “unknown unknowns” during exploration, it’s necessary to create views dynamically, which may result in a significant increase in the number of views. I’m not sure whether Kafka can handle such situations. Because monitoring requires greater real-time performance, it’s difficult to tolerate delays.
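For what it's worth, the consumer-per-view idea from the question is easy to sketch without Kafka at all. Below is a minimal in-memory stand-in (event shape and view names are hypothetical) where each "view" is an independent consumer replaying the same append-only log:

```python
# Append-only event log, standing in for a Kafka topic.
events = [
    {"host": "a", "cpu": 90},
    {"host": "b", "cpu": 40},
    {"host": "a", "cpu": 95},
]

def alert_view(log):
    """View 1 (alerts): hosts that ever exceeded 80% CPU."""
    return sorted({e["host"] for e in log if e["cpu"] > 80})

def dashboard_view(log):
    """View 2 (dashboard): latest CPU reading per host."""
    latest = {}
    for e in log:
        latest[e["host"]] = e["cpu"]
    return latest

print(alert_view(events))      # hosts needing alerts
print(dashboard_view(events))  # current dashboard state
```

With real Kafka, each view would be its own consumer group over the same topic. The parent's objection is visible even here: every new view re-reads the entire stream, with no index to shortcut the scan.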


> Study porn

At least they're not studying how to gobble up data on everyone to sell ads, complete with a three-letter agency's backdoor.


> embedded Facebook trackers

And most social trackers, Google Analytics, AdSense, most CAPTCHA alternatives, and Stripe anti-fraud scripts.

People have sold their audience to FAANG for 2 decades now.

And let's not think too much about the Android and iOS ecosystems (phone, TV, "assistants" etc.).


I still think a simple expert system, operated by a nurse who is used to getting people to explain their symptoms, would be enough to replace a general MD.

Less time and money spent training those nurses, which you can then spend on training specialists. And your expert system will take less time to update than retraining thousands of doctors every time some new protocol or drug is released.


> leaving only mental capability as a differentiator

I think a huge part of most sports (especially combat ones) is muscle memory. You don't have time to think between moves. So if you want to be good you'll still have to work for days and make your body learn.

And if you think muscle memory is bullshit, try to remember how hard driving was at first; nowadays you can almost sleep through your commute.


I think that's the same with coding in a stack you're really familiar with, especially if you're fluent in your editor. Sometimes your mind is a few steps ahead of what you're actually writing. The bulk of the work happens below the level of conscious focus.

