If you count the efficiency of the steam turbines needed to actually generate electricity from the fusion thermal energy, you'd need something like 750 MJ of fusion energy to break even (assuming your steam turbines are 40% efficient).
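To spell out the arithmetic (a rough sketch; the ~300 MJ of input energy is just what the 750 MJ figure implies at 40% conversion, not an official number):

    # rough sketch of electrical break-even, assuming 40% thermal-to-electric efficiency
    turbine_efficiency = 0.40
    input_energy_mj = 300  # implied wall-plug input: 750 MJ * 0.40
    fusion_energy_needed_mj = input_energy_mj / turbine_efficiency
    print(fusion_energy_needed_mj)  # 750.0 -> MJ of fusion output just to break even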
Given that you'd want to actually generate electricity rather than just break even we're talking about three orders of magnitude rather than two.
The first computer fit in a hangar, consumed an enormous amount of energy, and provided a tiny fraction of the computing power that you now have in your smartphone.
While you are right, sometimes I can't help but feel like Moore's Law (etc.) has done us a disservice by making us compare every kind of technological progress to the progress in computer hardware (or, I guess, electronics more broadly) and expect that kind of progress in other domains. Are there any other fields that have experienced the same sort of staggering, exponential improvement? Off the top of my head, think of, say, food/agriculture, biology, aerospace engineering, construction engineering, etc. All have seen steady, impressive improvements, but nothing comparable to the sustained, decades-long exponential improvement of Moore's Law - nothing comparable to going from room-sized computers to having 1000x the compute power in a smartphone chip.
(EDIT: This isn't to say that those fields are worse, or the scientists there less skilled, or anything like that. They're just different domains. "Increase transistor density" may simply be an easier problem to solve - despite being an incredibly difficult problem - than the issues in those fields.)
I'm going off on a tangent a bit, but all I'm trying to say is, I feel like "if electronics manufacturing can improve at X rate, then surely Y field can also improve at that rate" is a bit of a fallacy.
Of course you're right in general, but the fusion triple product actually did increase exponentially, at a faster pace than Moore's Law, from 1970 to 2000. Then for a while everybody decided to put most of the money into a giant construction project in France that still isn't finished. Now we're partway back to the system of competing smaller projects that we had during the exponential period.
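(For anyone unfamiliar: the triple product here is plasma density times temperature times energy confinement time, and the commonly quoted ballpark threshold for D-T ignition is roughly

$$ n \, T \, \tau_E \gtrsim 3 \times 10^{21}\ \mathrm{keV\,s\,m^{-3}} $$

treat the exact number as an order-of-magnitude figure rather than a precise cutoff.)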
Lasers have also been improving dramatically. In particular, the power of fast lasers has been going up exponentially.
A computer simply automates something you can do with your bare hands: calculate. Manipulating the strong nuclear force is not even comparable.
My opinion about fusion is that by the time they figure it out (which I think could eventually be done, if we invest a large portion of humanity's knowledge and wealth), it won't even be worth it. We could have almost-free energy now with fission, and renewables keep getting better. Fusing atoms (and getting more energy back) will be an astonishing feat when we accomplish it, but it won't offer much benefit over existing power generation. Financially, for instance, it would take a lifetime to recover the costs invested. Even once it's figured out, it will still take decades to build the plants, which will be buggy first-generation models (that still involve dangerous radiation, just in a more manageable form). I really wanted it to succeed (20 years ago, say), but now I think it's a lost cause.
For computing devices, being smaller typically means using less energy as well, so it’s a bit different from a power generation facility, where the whole point is power.