> Software efficiency seems to be exponential in many places as well
No, in one place. That single example gets touted again and again, but it covers just one class of algorithms, and much of the cited performance gain comes from a small set of well-known example problems. (I took Professor Groetschel's linear optimisation class at university).
In compilers, for example, progress is much more "majestic": Proebsting's law [1], which posits that compiler optimization advances double computing power only every 18 years, was shown to be wildly optimistic even back in 2001 [2]: "The reality is somewhat grimmer than Proebsting initially supposed." More recently, progress seems to have pretty much stalled for most applications. For example: "Using Stabilizer, we find that, across the SPEC CPU2006 benchmark suite, the effect of the -O3 optimization level versus -O2 is indistinguishable from noise." [3]
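For scale, here is the back-of-the-envelope arithmetic behind that framing (the comparison is mine, taking Proebsting's 18-year doubling and the 18-month hardware doubling commonly quoted for the Moore's-law era):

    2^{1/18}  \approx 1.04 \quad \text{(compilers: about 4\% per year)}
    2^{12/18} \approx 1.59 \quad \text{(hardware: about 60\% per year)}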
But while the returns are diminishing, the cost of obtaining those diminishing returns has gone up tremendously ("exponentially"?). For example, Minix's time to compile itself (including the compiler) apparently went from 10 minutes to over 2 hours when they switched from their own compiler to clang [4]. For Plan9 the same switch took compile times from 45 seconds to over 2 hours [5], a roughly 12x and 160x slowdown, respectively.
And with Swift we get even longer compile times, to the point where the compiler gives up on some one-liners after a minute, and the resulting code is slower, too!
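For flavour, a minimal sketch (my own illustrative line, not taken from a bug report) of the kind of expression that can defeat Swift's expression type checker: mixed integer and floating-point literals combined with overloaded operators force the constraint solver to explore a combinatorial number of possible typings, and depending on the compiler version it can give up with a diagnostic along the lines of "the compiler is unable to type-check this expression in reasonable time".

    // Illustrative only: each literal could be an Int, a Double, or any other
    // ExpressibleBy*Literal type, and every + and unary - is overloaded, so the
    // solver has to try many combinations before settling on (or giving up on)
    // a typing for the whole line.
    let x = -(1 + 2.0) + -(3 + 4.5) + -(5 + 6.0) + -(7 + 8.5) + -(9 + 10.0)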
I could give many more examples, but generally Wirth's law holds: software gets slower faster than hardware gets faster.
[1] http://proebsting.cs.arizona.edu/law.html
[2] http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=66C...
[3] https://emeryberger.com/research/stabilizer/
[4] https://news.ycombinator.com/item?id=26452155
[5] https://news.ycombinator.com/item?id=26452392