To be frank, I have always found such dogmatism deeply questionable. I can understand caveats like "for long-term maintainability", but taken literally it suggests asinine ideas like using bubble sort instead of merge sort for a large set, not because of any memory footprint constraints but because it is easier to read. Insisting on ignoring reality is not a good pattern.
More charitably, the better argument is that it is easier to adapt a compiler to re-optimize code than to tweak an entire code base by hand - and the hand-tweaking rarely pays for itself anyway.
>... but taken literally it suggests asinine ideas like using bubble sort instead of merge sort for a large set, not because of any memory footprint constraints but because it is easier to read.
No, it doesn't suggest this at all. It means: use merge sort, but implement it in a clean, understandable way.
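To make the point concrete, here is a minimal sketch of what "merge sort, written cleanly" might look like - a plain top-down implementation in C, with all names and structure chosen by me for illustration. It keeps the O(n log n) algorithm; only the presentation aims for readability.

```c
#include <stdlib.h>
#include <string.h>

/* Merge the sorted halves a[lo..mid) and a[mid..hi) via a scratch buffer. */
static void merge(int *a, int *tmp, size_t lo, size_t mid, size_t hi) {
    size_t i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (hi - lo) * sizeof *a);
}

/* Sort a[lo..hi) by recursively sorting each half, then merging. */
static void msort(int *a, int *tmp, size_t lo, size_t hi) {
    if (hi - lo < 2) return;            /* 0 or 1 elements: already sorted */
    size_t mid = lo + (hi - lo) / 2;
    msort(a, tmp, lo, mid);
    msort(a, tmp, mid, hi);
    merge(a, tmp, lo, mid, hi);
}

void merge_sort(int *a, size_t n) {
    int *tmp = malloc(n * sizeof *a);
    if (!tmp) return;                   /* allocation failed: leave input as-is */
    msort(a, tmp, 0, n);
    free(tmp);
}
```

Nothing here trades asymptotic performance for readability; the two goals simply aren't in conflict for code like this.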
It is a spectrum, but we are already some way along it. Look at performance practices that have since been deprecated, like manually unrolling fixed-count for loops: compilers now handle those for us, and hand-written assembly is still pretty rare.