The fact is, you do need to get rid of more heat when using hot water.
Ergo, some sort of self-reinforcing feedback loop must kick in when you use hot water, one that sucks heat out of it faster. If you use cold water, the "turbocharged" mechanism doesn't operate for some reason, and heat is lost at a lower rate.
I've no idea what that mechanism is. It's nonlinear behavior for sure.
A hypothesis I’ve heard is that hot water is more likely to set up convection currents inside the container, since there’s a larger temperature difference between the interior and the water right at the edge. These currents then make the water cool more uniformly rather than outside-in. I believe the person I heard this from tested it by adding baffles to the container to suppress the currents, but I’m not sure what the results of that experiment were.
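To make the idea concrete, here's a toy model of my own (not the baffle experiment above): Newton's law of cooling, where the effective heat-transfer coefficient grows with the temperature difference, standing in for convection stirring the water. All the constants are made up for illustration.

```python
T_AMBIENT = 20.0      # ambient temperature, deg C
H_BASE = 0.02         # baseline heat-transfer coefficient per second (assumed)
H_CONVECTION = 0.001  # extra transfer per degree of excess temperature (assumed)
DT = 1.0              # time step, seconds

def cool(temp, seconds):
    """Integrate dT/dt = -h(T) * (T - T_ambient), with h rising as the
    water gets hotter -- a stand-in for convection currents."""
    for _ in range(int(seconds / DT)):
        diff = temp - T_AMBIENT
        h = H_BASE + H_CONVECTION * diff  # hotter water stirs itself more
        temp -= h * diff * DT
    return temp

# After the same minute, the initially hot sample has shed much more heat,
# and not just proportionally more -- its per-degree cooling rate is higher.
print(cool(90.0, 60), cool(40.0, 60))
```

Note that in this simple model the hot sample still never ends up *colder* than the cold one; it just loses heat at a disproportionately higher rate. Getting an actual crossover needs physics beyond this sketch, which fits the "I've no idea what that mechanism is" above.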
For example, this is why large amateur telescopes perform worse for about an hour after they're brought outside: the mirror is "hot" compared to the air, so a network of toroidal convection cells (spinning doughnuts of air) forms on its surface, ever so slightly messing with the rays of light.
There's actually an attractor for some of these networks (speaking in general), where they form hexagonal cells. Of course, in most cases, it's more disorderly than that.
I'm not sure how a turbocharged network could form, but I'm looking forward to learning more on this topic.