The key "ability" that will grow exponentially is AI's ability to convert investment dollars into silicon+electricity and then further reduce those into heat energy. Such schemes only seem wasteful to outsiders, those whose salaries are not tied to their ability to convert money into heat. A fun startup would be one that generates useful electricity from the AI investment cycle. If we put the AI machine under a pot of water, we might then use the resulting steam to drive a turbine.
Due to Carnot's law, you can't get much electricity that way without a big temperature difference. Think about it: the AI machine would have to run at 100 degrees Celsius at minimum just to boil the water, and even that buys you very little efficiency.
But if we could make computers that run at, say, 2000 degrees without using several times more electricity, we could capture their waste heat and turn a large portion of it back into electricity to re-feed the computers. That doesn't violate thermodynamics; it's just an alternative route to computers that use less electricity overall (as opposed to directly reducing the energy used by silicon logic gates), so long as we stay well above Landauer's limit.
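To put rough numbers on the Carnot argument: efficiency is capped at 1 - T_cold/T_hot (in Kelvin), so a chip that is just hot enough to boil water is a poor heat source, while a hypothetical 2000-degree chip is a good one. A minimal sketch (the function name and the 20 C ambient temperature are my own assumptions):

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum fraction of heat convertible to work (Carnot limit).

    Temperatures are given in Celsius and converted to Kelvin,
    since the Carnot formula needs absolute temperatures.
    """
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# A chip just hot enough to boil water, rejecting heat to ~20 C ambient:
print(f"{carnot_efficiency(100, 20):.0%}")   # ~21%

# A hypothetical 2000 C chip, same ambient:
print(f"{carnot_efficiency(2000, 20):.0%}")  # ~87%
```

Real turbines fall well short of the Carnot bound, but the gap between ~21% and ~87% is the point: the recoverable fraction grows fast with hot-side temperature.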
Some datacentres do in fact recover the heat for things like municipal heating. It's tricky, though, because being near population centres that can use the heat is often (not always) at odds with what datacentres want: cheap land, cheap power, and a lack of neighbours to moan about construction and cooling-system noise.
There was also a startup selling/renting bitcoin miners that doubled as electrical heaters.
The problem is that a computer is fundamentally a resistor, so at best you get 100% of the electrical energy back as heat. A heat pump, by contrast, can deliver 2-4 units of heat per unit of electricity. So the AI work (or bitcoin mining) has to be worth that efficiency gap, plus the capital outlay on the expensive computers.
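The comparison above can be made concrete: heating with computers costs COP-times more electricity than heating with a heat pump, and the compute must be worth at least that gap. A rough sketch, where the function name, the COP of 3, and the example prices are all illustrative assumptions:

```python
def extra_heating_cost(heat_demand_kwh: float,
                       price_per_kwh: float,
                       heat_pump_cop: float = 3.0) -> float:
    """Extra electricity cost of heating with computers (COP = 1)
    instead of a heat pump.

    The computing done must be worth at least this much, plus
    hardware amortisation, to break even. A COP of 3.0 is an
    assumed mid-range value; real heat pumps run roughly 2-4
    depending on conditions.
    """
    resistive_cost = heat_demand_kwh * price_per_kwh              # computers
    heat_pump_cost = (heat_demand_kwh / heat_pump_cop) * price_per_kwh
    return resistive_cost - heat_pump_cost

# e.g. 1000 kWh of winter heat at $0.20/kWh:
print(round(extra_heating_cost(1000, 0.20), 2))  # 133.33
```

In other words, a computer-heater in this scenario must produce well over a hundred dollars of useful compute per heating season before it beats a heat pump, even ignoring the hardware cost.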