I thought I was pretty clear about that. Synchronization is relatively straightforward in a real-time system, where your program has total control over the timing of execution. On a desktop operating system, however, the graphics card and its drivers can lie, the operating system can lie, the operating system can swap your process out for something else whenever it wants, there is a delay associated with accessing a PC's high-accuracy timer (so even the clock lies!), and on top of all that, everyone's PC is different. It's still possible to get some rough degree of synchronization going, but it's very difficult and imperfect.
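To put a rough number on the "even the clock lies" point, here's a small sketch (mine, not from the post) that measures two things: the average cost of a single high-resolution timer read, and the worst gap between consecutive reads. The function names are made up for illustration; a large spike in the worst-gap number is exactly the kind of OS preemption being described.

```python
import time

def timer_read_overhead(samples=100_000):
    """Estimate the average cost of one high-resolution timer read."""
    start = time.perf_counter()
    for _ in range(samples):
        time.perf_counter()  # each read has a nonzero, nontrivial cost
    elapsed = time.perf_counter() - start
    return elapsed / samples  # seconds per read

def worst_gap(samples=100_000):
    """Largest gap between consecutive reads.

    On an idle real-time system this would stay near the read cost;
    on a desktop OS, scheduler preemption shows up as big outliers.
    """
    worst = 0.0
    prev = time.perf_counter()
    for _ in range(samples):
        now = time.perf_counter()
        worst = max(worst, now - prev)
        prev = now
    return worst

print(f"avg timer read: {timer_read_overhead() * 1e9:.0f} ns")
print(f"worst gap:      {worst_gap() * 1e6:.1f} us")
```

Run it a few times while something else is loading in the background and the worst-gap figure can jump by orders of magnitude, which is the whole problem in miniature.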
If you still don't think I'm being serious about the timing stuff, consider that some folks still keep around old machines running DOS for real-time communication with microcontrollers.
Now this is interesting. What could operating systems, drivers, and GPU manufacturers do to restore a DOS-like, real-time sync of frame generation/transfer/display?
Now that every gamer has a multi-core processor, couldn't you allocate n-1 of the cores to the game with guaranteed non-preemption, and keep one core where preemption can occur?
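No mainstream OS offers guaranteed non-preemption today, but CPU affinity gets you part of the way: you can at least confine your process to a chosen set of cores and (by convention) leave one core for everything else. A minimal Linux-only sketch, assuming the split-off-core-0 policy described above (the `pin_to_cores` helper is hypothetical):

```python
import os

def pin_to_cores(cores):
    """Restrict the calling process to the given CPU cores (Linux-only)."""
    os.sched_setaffinity(0, set(cores))  # pid 0 = the calling process
    return os.sched_getaffinity(0)

# Hypothetical split: leave core 0 for "everything else" and run the
# game's threads on the remaining cores (fall back gracefully on a
# single-core machine).
all_cores = sorted(os.sched_getaffinity(0))
game_cores = all_cores[1:] or all_cores
print("game pinned to cores:", pin_to_cores(game_cores))
```

Note this only keeps *your* threads off core 0; it doesn't stop the kernel from scheduling other work onto your reserved cores. Truly fencing cores off requires cooperation from the OS (e.g. Linux boot-time core isolation), which is much closer to the "dedicated console mode" idea below.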
It's kind of a cool idea. I wonder if OSes would ever implement something like this. You would probably need to put some iOS-like restrictions on it: only one app at a time, it must be running full screen and in focus, and it can't keep that kind of priority in the background. Like having a dedicated console system built in.