Uhm - I'd be inclined to hope that a cluster of GPUs (or even just the high-end ones being used for something like this) would be capable of rendering at rather more than 60fps; whilst screen refresh rates may not match that, I would expect each frame to be rendered in rather less than 16ms.

Similarly, by using a decently parallelisable image compression codec (JPEG2000?), it should be possible to take advantage of the many GPUs for compression, too.
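For a rough sense of what I mean, here's a minimal sketch of tile-parallel frame compression - it uses CPU worker processes and Pillow's JPEG encoder as a stand-in for per-GPU encoders, and the tile size is an arbitrary assumption:

    # Tile-parallel compression sketch: CPU workers with Pillow's JPEG
    # encoder stand in for per-GPU encoders. Tile size is an assumption.
    from concurrent.futures import ProcessPoolExecutor
    from io import BytesIO
    from PIL import Image

    TILE = 256  # assumed tile edge in pixels

    def compress_tile(args):
        raw, size = args
        buf = BytesIO()
        Image.frombytes("RGB", size, raw).save(buf, "JPEG", quality=85)
        return buf.getvalue()

    def compress_frame(frame):
        # frame is assumed to be an RGB Pillow Image
        w, h = frame.size
        tiles = []
        for y in range(0, h, TILE):
            for x in range(0, w, TILE):
                t = frame.crop((x, y, min(x + TILE, w), min(y + TILE, h)))
                tiles.append((t.tobytes(), t.size))
        # Tiles compress independently, so wall-clock encode time tends
        # towards the per-tile cost rather than the whole-frame cost.
        with ProcessPoolExecutor() as pool:
            return list(pool.map(compress_tile, tiles))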

The 16ms transfer allowance is certainly generously low; the 16ms decode figure is something I won't comment on, given the variability of implementation speeds (though it would be done in native code, of course), but a further 16ms to display... 32ms to decode and display a JPEG? I'm pretty sure I've seen MJPEG streams at higher than 30fps before!
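For concreteness, summing just the figures quoted above (these are the post's per-stage allowances, not measurements):

    # Back-of-the-napkin sum of the quoted figures; compression time
    # would add its own budget on top.
    budget_ms = {
        "render":   16,  # one frame at 60fps
        "transfer": 16,
        "decode":   16,
        "display":  16,
    }
    print(sum(budget_ms.values()), "ms end-to-end")  # 64 ms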

Back of a napkin they might be, but I think the figures in this post might require further thought.

Edit: again, to be clear, I'm not suggesting that they are using all of these advantages right now; but the idea that this can't reasonably be done for twitch gaming, even today, strikes me as bizarre, given that they are trying to set up a system with whatever custom technology is required to make it work.




A GPU (or a cluster of GPUs) might be able to process, say, 10,000 frames in one second. That does not mean the same GPUs can process one frame in 1/10,000 of a second: throughput and latency are not the same thing.
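As a toy illustration (the batch size and timing below are made-up numbers): a device that processes frames 100 at a time can report enormous frames-per-second figures while any single frame still takes the full batch time.

    # Toy numbers showing throughput != latency; both figures are
    # illustrative assumptions.
    batch_size = 100
    batch_time_ms = 10.0  # time to process one whole batch

    throughput_fps = batch_size / (batch_time_ms / 1000.0)
    print(f"throughput: {throughput_fps:.0f} frames/s")  # 10000
    print(f"latency of one frame: {batch_time_ms} ms")   # 10.0, not 0.1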

Even with an infinite number of parallel GPUs, there is fixed latency in copying memory to the GPU, running the job, and copying the result back. After the frame is compressed, sent over the network, and picked up by the client, further delay (possibly tens of milliseconds) is added before pixels appear on the screen.
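A minimal pipeline model makes the point - the stage delays below are illustrative assumptions, not measurements; even when frames stream at a full 60fps, each individual frame still arrives the sum of the stage delays after its input:

    # Fixed per-stage delays add up to glass-to-glass latency even when
    # the pipeline sustains full frame-rate throughput. All delays are
    # illustrative assumptions.
    STAGES_MS = [1, 4, 1, 16, 8, 16]  # copy-in, encode, copy-out,
                                      # network, decode, display

    def arrival_ms(i, input_interval_ms=16.0):
        # With perfect pipelining, frame i exits at its input time plus
        # the sum of all stage delays.
        return i * input_interval_ms + sum(STAGES_MS)

    print(arrival_ms(0))                  # 46.0 ms of added latency
    print(arrival_ms(1) - arrival_ms(0))  # 16.0 ms apart -> 60fps stream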

See the discussion around John Carmack's superuser post: http://superuser.com/questions/419070/transatlantic-ping-fas...



