Given the cost of CPU cycles vs. developers, I don't think this analogy works.
Sure, you can hire an extra C++ developer for $150k, or just give your Python developer a company credit card to use for extra AWS machines. I'm quite sure the second option is a lot cheaper.
There are actually two common fallacies here; I'll try to handle them separately.
CPU Cycles vs Developers
This is a common false dichotomy: the assumption that a language which spends more CPU cycles in its runtime necessarily buys more developer productivity than one that spends fewer.
That claim isn't self-evidently true, however.
If you assume, for example, that dynamic languages are more productive for developers (I don't, but it's a common argument, so we'll go with it), the dichotomy is easily disproven by comparing the performance of a highly optimized JIT -- such as V8's -- against Python's interpreter.
Irrespective of the language, runtimes vary widely in quality. V8 is faster than Python's reference interpreter (CPython) simply because V8 is a better JIT than CPython is an interpreter.
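For a crude illustration -- a minimal sketch of my own, with timings entirely machine- and version-dependent -- consider a tight numeric loop. Under CPython, every iteration pays for generic bytecode dispatch and boxed arithmetic; V8 JIT-compiles the equivalent JavaScript loop down to native code and, in my experience, runs it many times faster:

    import time

    def sum_squares(n: int) -> int:
        # Every +, *, and loop step here goes through CPython's generic
        # bytecode dispatch, operating on boxed integer objects.
        total = 0
        for i in range(n):
            total += i * i
        return total

    start = time.perf_counter()
    sum_squares(10_000_000)
    print(f"CPython: {time.perf_counter() - start:.2f}s")

    # The equivalent JavaScript under V8 (node):
    #   function sumSquares(n) {
    #     let total = 0;
    #     for (let i = 0; i < n; i++) total += i * i;
    #     return total;
    #   }
    # V8 typically runs this loop many times faster than CPython,
    # despite JavaScript being just as dynamic a language.

The exact ratio varies, but the gap comes from runtime quality, not from one language being "higher level" than the other.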
If we discard the unproven "dynamic languages are more developer-efficient" hypothesis, things become even starker. The JVM, for example -- with real threads, a highly optimized JIT, and the advantage of operating on a much more strongly typed language -- is in fact faster than V8, all with a "managed" language, and not C++.
Taking that a step further, we've recently begun rediscovering ahead-of-time compilation of so-called "managed" or "high-level" languages. The favoritism toward JIT arose out of efforts to achieve high performance in dynamic languages, where very little can be statically guaranteed. What has recently become clear is this: in more static languages, we can achieve the same kind of "managed" runtime without introducing the overhead or complexity of a JIT at all!
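As a small sketch of why static knowledge makes the JIT unnecessary (my example, not anything definitive): once the types in a hot loop are pinned down ahead of time, there is nothing left for a JIT to discover at runtime. AOT compilers for type-annotated Python -- mypyc and Cython are real examples -- exploit exactly this:

    # (list[float] annotation syntax requires Python 3.9+)
    def dot(xs: list[float], ys: list[float]) -> float:
        # With the element types known ahead of time, a compiler can
        # emit direct floating-point multiply/add instructions -- no
        # runtime type checks, no JIT warm-up, no deoptimization paths.
        total = 0.0
        for x, y in zip(xs, ys):
            total += x * y
        return total

    print(dot([1.0, 2.0], [3.0, 4.0]))  # 11.0

This file still runs under plain CPython, annotations and all; the type hints simply give an AOT compiler enough information to emit native arithmetic instead of generic dispatch.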
All combined, I see very little argument for a dichotomous choice between "inefficient, high-level language" and "efficient, low-level language" -- the choices seem to simply be "inefficient" vs. "efficient".
Relative Value of High-Paid Developers
Lastly, I wish to address the "extra C++ developer for 150K". I'll keep this one brief: simply put, I would hypothesize that one $150K expert-level engineer is worth anywhere from two to ten $80K non-expert engineers -- at that ratio, a single $150K salary replaces $160K to $800K of payroll.
This is simply because an expert-level engineer's experience and deep knowledge of the technology stack allow them to architect systems that maximize maintainability, developer efficiency, and system efficiency over time.
Computing is a value multiplier: the difference between a high multiplier and a low one can be objectively enormous.
Having managed both kinds of teams -- ones where I inherited cheaper, more junior engineers, and ones where I hand-picked a small group of extremely experienced engineers -- I've saved time, money, and headaches with the more expensive engineers every time.
Very interesting response. I strongly agree with most of it.
However, I think there's another false dichotomy there: a $150K expert-level engineer versus $80K non-expert engineers. In reality, there are only expert and non-expert engineers -- pay doesn't seem to be a particularly good discriminator. Further, it is genuinely difficult to tell the difference between an expert and a non-expert, hence all the fun interviewing follies. And even the best of those don't work particularly well.
Thanks for your reply; I concur with both of your assertions.
When it comes to differentiating expertise, we often have to look for secondary indicators; pay has an extremely poor correlation with competence, especially in high-demand job markets.