Meh. I feel like there needs to be an active movement to audit programs with huge scale (>10M users) for unnecessary power usage - whether it's from a bug, from unused functionality that still consumes resources, or from intermediate steps that burn power for no benefit.
Perhaps I'm getting into a bit of a niche here, but the rise of stringy formats for data transfer concerns me. There are many-stage pipelines running on machines that all agree on what a 64-bit integer is, yet each stage decodes and re-encodes JSON twice per request (decode on receipt, encode to forward it to the right place, decode the response, encode it yet another way to reply to the original sender). Sounds like a minor concern, but at this scale it instinctively feels like it'd dwarf 250 MW globally.
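A toy micro-benchmark makes the per-hop overhead concrete. This is just a sketch (Python chosen for brevity; the exact ratio will vary wildly by language and JSON library), comparing one stage's JSON decode/re-encode cycle against shuttling the same integer as 8 fixed bytes:

```python
import json
import struct
import timeit

value = 2**53 - 1  # an integer both ends of the pipe already agree on

def json_round_trip():
    # What each pipeline stage does: parse the text on receipt,
    # then serialize it again to forward it downstream.
    decoded = json.loads(json.dumps({"id": value}))
    return json.dumps(decoded)

def binary_round_trip():
    # What the machines could do instead: 8 fixed bytes, no parsing.
    packed = struct.pack("<Q", value)
    (unpacked,) = struct.unpack("<Q", packed)
    return struct.pack("<Q", unpacked)

n = 100_000
print("JSON  :", timeit.timeit(json_round_trip, number=n))
print("binary:", timeit.timeit(binary_round_trip, number=n))
```

On any machine I've tried something like this, the stringy path loses by a large constant factor, and that factor is paid at every hop of every request.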