I got into programming professionally around 2014, which was well after memory and the like stopped being a real concern for 99% of applications, seemingly.
I always look enviously back at when you had to figure out scrappy, bootstrappy ways to solve problems because of the inherent limits of the hardware, instead of just sitting there gluing frameworks together like a loser.
Am I wrong or was programming just way cooler back then?
For anything that isn't IO-bound, you are very often bound by memory access time. CPUs are incredibly fast, and feeding them data to work on is genuinely hard, more so today than before!
The big difference is that a category of programming jobs has appeared where this is almost never a concern because it’s about shoving text around between servers.
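That memory-bound behavior is easy to poke at even from a high-level language. A minimal sketch below sums the same list twice, once in order and once through a shuffled index order; the exact timings depend on your machine, and CPython's interpreter overhead muddies the pure cache effect, but the shuffled walk is typically noticeably slower because the sequential one is friendlier to the hardware prefetcher:

```python
import random
import time

N = 2_000_000
data = list(range(N))

# Sequential walk: predictable access pattern the prefetcher can keep up with.
start = time.perf_counter()
total_seq = sum(data)
seq_time = time.perf_counter() - start

# Shuffled walk over the same elements: each access jumps around in memory,
# so many more of them are likely cache misses.
order = list(range(N))
random.shuffle(order)
start = time.perf_counter()
total_rand = sum(data[i] for i in order)
rand_time = time.perf_counter() - start

print(f"sequential: {seq_time:.3f}s, shuffled: {rand_time:.3f}s")
```

Both loops do the same arithmetic on the same values; only the order of memory accesses differs, which is the whole point.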