That is a poor deflection of the underlying point. It's absurd to conclude that an obviously undesirable behavior -- however unlikely to pose a problem in practice -- should not even be considered, let alone addressed.
It could be as simple as a modest global rate limit on repeated GET requests to the same URL. We could start with 250 msec and see how that goes.
Or it could be as simple as limiting F5 reloads to once per keydown. Let users work for their accidental DoS attacks. :-)
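For concreteness, here is a minimal sketch of the first idea, written as a page-level fetch wrapper rather than as browser internals (the names `throttledGet` and `MIN_REPEAT_MS` are invented for the example; the 250 ms figure is the starting value proposed above):

```typescript
// Sketch: suppress repeat GETs to the same URL within a minimum interval.
const MIN_REPEAT_MS = 250;
const lastGet = new Map<string, number>();

async function throttledGet(url: string): Promise<Response | null> {
  const now = Date.now();
  const prev = lastGet.get(url);
  if (prev !== undefined && now - prev < MIN_REPEAT_MS) {
    return null; // drop the repeat instead of hitting the server again
  }
  lastGet.set(url, now);
  return fetch(url);
}
```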
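The once-per-keydown variant can be sketched the same way using the standard `KeyboardEvent.repeat` flag, which is true for the auto-repeated keydown events fired while a key is held. (Whether a page is allowed to cancel the browser's own F5 handling varies by browser; the actual proposal would live in the browser's input handling, so this is only an approximation.)

```typescript
// Sketch: reload at most once per physical key press, ignoring auto-repeat.
window.addEventListener("keydown", (event) => {
  if (event.key === "F5") {
    event.preventDefault(); // take over the default reload (where permitted)
    if (!event.repeat) {
      location.reload();    // act only on the initial press, not on repeats
    }
  }
});
```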
What's absurd is protecting the server from a vanishingly rare accident by changing the client. If you feel you need protection, put it where it belongs: on the server, where it also works against more likely abuse.
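For comparison, server-side protection of the kind argued for here is commonly a per-client token bucket applied before request handling. A minimal sketch (the names `allow`, `RATE_PER_SEC`, and `BURST` are illustrative; real deployments would typically use a web server's or proxy's built-in rate limiting instead):

```typescript
// Sketch: token bucket per client IP. Each request spends one token;
// tokens refill at a steady rate up to a burst ceiling.
interface Bucket { tokens: number; last: number; }

const RATE_PER_SEC = 4; // sustained requests per second per client (example value)
const BURST = 8;        // short bursts tolerated before throttling (example value)
const buckets = new Map<string, Bucket>();

function allow(clientIp: string): boolean {
  const now = Date.now();
  const b = buckets.get(clientIp) ?? { tokens: BURST, last: now };
  // Refill in proportion to elapsed time, capped at the burst ceiling.
  b.tokens = Math.min(BURST, b.tokens + ((now - b.last) / 1000) * RATE_PER_SEC);
  b.last = now;
  const ok = b.tokens >= 1;
  if (ok) b.tokens -= 1;
  buckets.set(clientIp, b);
  return ok;
}
```

Unlike a client-side tweak, this throttles any client that repeats requests too quickly, accidental or not, which is the "works against more likely abuse" point.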