it's true that all of these things make a pretty noticeable difference when you know what to look for. but you're not going to recognize 60hz or vsync as "unplayable" unless you're very accustomed to playing on a better setup.
> I could understand if the get clock speed was tied to the graphical update rate but I presume that's not the case for CS, server side game etc; or even if it was it's still not going to be that material.
in cs source, the client could not send updates to the server at a faster rate than it was drawing frames. so if you were playing on a 66 tick server but only rendering 50 FPS, your client was only sending ~50 updates per second, meaning you were actually losing some simulation fidelity. of course, if you're not the type of person to notice your frame rate dropping to 50 in the first place, you probably wouldn't notice this consequence either. just an interesting technical fact.
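the effect can be sketched in a couple of lines (an illustrative model, not actual Source engine code; the function name is made up):

```python
# Sketch of the CS:S behavior described above: the client sends at most one
# update per rendered frame, so the rate the server sees is capped by FPS.
def effective_update_rate(fps: float, tickrate: float) -> float:
    """Updates per second the server actually receives from this client."""
    return min(fps, tickrate)

# On a 66 tick server at 50 FPS, the server only sees ~50 updates/sec:
print(effective_update_rate(50, 66))   # 50
# Above the tickrate, the tickrate becomes the cap:
print(effective_update_rate(100, 66))  # 66
```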
60Hz vsync literally feels worse than 50FPS/60Hz uncapped. I believe 32ms of input lag is not in itself blatantly noticeable. But on top of the base lag of every single game made after 1998, it's extremely bad.
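for a rough sense of where a figure like 32ms comes from: a common rule of thumb (an assumption here, not a measurement) is that double-buffered vsync can hold a finished frame for up to about two frame-times before it reaches the screen.

```python
# Back-of-envelope vsync latency model (assumed ~2 buffered frames, not measured).
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

def worst_case_vsync_lag_ms(hz: float, buffered_frames: int = 2) -> float:
    return buffered_frames * frame_time_ms(hz)

print(round(worst_case_vsync_lag_ms(60), 1))  # 33.3 -- in the same ballpark as 32ms
```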