if you define efficiency as compute per watt. I don't give a flying crap about watts. Efficiency is measured as amount of work done per hour. Because I get paid for the work, and then I pay the two dollars a week for the watts. I don't care if it's five dollars a week for the watts or two. I do care if it's two hours of waiting time versus five.
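The tradeoff above is really dollars-per-week arithmetic: waiting time costs far more than watts. A minimal sketch, assuming a hypothetical hourly rate (the $2/$5 per week electricity figures and the 2-vs-5 hours of waiting come from the post):

```python
# Weekly cost comparison: waiting time vs. electricity.
# HOURLY_RATE is a hypothetical assumption; the rest is from the post.
HOURLY_RATE = 75          # $/hour of paid work (assumed)
HOURS_WAITING_FAST = 2    # hours of waiting per week on the fast machine
HOURS_WAITING_SLOW = 5    # hours of waiting per week on the slow machine
WATTS_COST_FAST = 5       # $/week electricity, fast (hotter) machine
WATTS_COST_SLOW = 2       # $/week electricity, slow machine

fast_total = WATTS_COST_FAST + HOURS_WAITING_FAST * HOURLY_RATE
slow_total = WATTS_COST_SLOW + HOURS_WAITING_SLOW * HOURLY_RATE

print(fast_total, slow_total)  # 155 377
```

Even at a modest assumed rate, the extra three hours of waiting dwarfs the $3/week of electricity savings.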
lol no. the $20/month cost of electricity is a rounding error next to my $6k laptop and the $50k of software licenses on it. It's even less of a rounding error for the datacenter, where a $500k ESX farm with several million dollars of software on it uses $5k of electricity per month, including cooling.
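A back-of-envelope split makes the point. A minimal sketch, assuming a 4-year amortization window and taking "several million in software" as $3M (both assumptions; the $500k farm and $5k/month electric figures are from the post):

```python
# Monthly TCO split for the datacenter example above.
# YEARS and the software total are assumptions; other figures from the post.
YEARS = 4                      # assumed amortization window
hardware = 500_000             # ESX farm
software = 3_000_000           # "several million" in licenses (assumed)
electric_per_month = 5_000     # including cooling

capex_per_month = (hardware + software) / (YEARS * 12)
electric_share = electric_per_month / (capex_per_month + electric_per_month)

print(f"capex/month: ${capex_per_month:,.0f}")
print(f"electric share of monthly cost: {electric_share:.1%}")
```

Under these assumptions electricity is a single-digit percentage of the monthly spend, so halving the power bill moves the total very little.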
Have you noticed almost no one uses ARM? There are reasons for that, one of them being software licensed per core: fewer, faster, hotter cores win.
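The per-core licensing math is easy to sketch. Assuming a hypothetical $2k/core/year license fee and made-up throughput numbers, two designs delivering the same total throughput pay very different license bills:

```python
import math

# Why per-core licensing favors fewer, faster cores.
# All numbers here are hypothetical, for illustration only.
LICENSE_PER_CORE = 2_000  # $/core/year (assumed)

def yearly_license(total_throughput, perf_per_core):
    """Cores needed to hit a throughput target, and the license bill."""
    cores = math.ceil(total_throughput / perf_per_core)
    return cores, cores * LICENSE_PER_CORE

fast = yearly_license(total_throughput=100, perf_per_core=4)
slow = yearly_license(total_throughput=100, perf_per_core=2)

print(fast)  # (25, 50000)
print(slow)  # (50, 100000)
```

Same delivered throughput, but the many-slow-cores box pays twice the license fee, which is why per-core pricing historically tilted buyers toward fewer hot cores.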
They are way more efficient as server cores. https://aws.amazon.com/ec2/graviton/