Typing lag is such a sad result of all our modern computing abstractions.

https://www.extremetech.com/computing/261148-modern-computer...



Hm. I've always thought it was more of a result of our current display technology? Digital displays buffer an entire frame before they display it. Sometimes several frames. And the refresh rate is usually 60 Hz so each buffered frame adds a delay of 16 ms. CRTs on the other hand have basically zero latency because the signal coming in directly controls the intensity of the beam as it draws the picture.

Anyway, is it any better on displays that have a higher refresh rate? I feel like it should make a substantial difference.
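
A rough back-of-the-envelope in Python, assuming (pessimistically) that each buffered stage costs one full refresh period:

    # Added latency per buffered frame at common refresh rates (approximate).
    for hz in (60, 75, 85, 120, 144, 240):
        frame_ms = 1000.0 / hz
        print(f"{hz:>3} Hz: {frame_ms:.1f} ms per buffered frame, "
              f"{3 * frame_ms:.1f} ms with three frames queued")

So a higher refresh rate should help roughly in proportion, as long as the number of buffered frames stays the same.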


CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

All CRT displays attached to computers in the last 40 years were driven from memory buffers just like LCDs, and those buffers were typically only allowed to change while the electron beam was "off", i.e. moving from the bottom of the screen back to the top. Letting the buffer change while the beam is writing results in "tearing" the image, which was usually considered a bad thing.
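
To put rough numbers on the scanout timing (a toy model in Python, ignoring retrace time):

    # Delay before a changed pixel becomes visible on a 60 Hz raster scan,
    # depending on where the change lands relative to the beam position.
    refresh_hz = 60
    frame_ms = 1000.0 / refresh_hz              # ~16.7 ms for one full pass

    def scanout_delay_ms(pixel_pos, beam_pos):
        """pixel_pos and beam_pos are fractions of screen height, top = 0."""
        frames_to_wait = (pixel_pos - beam_pos) % 1.0   # wraps to the next pass
        return frames_to_wait * frame_ms

    print(scanout_delay_ms(0.49, 0.50))   # just behind the beam: ~16.5 ms
    print(scanout_delay_ms(0.51, 0.50))   # just ahead of the beam: ~0.2 ms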


> CRTs are potentially worse.

Video game aficionados would like to have a word with you:

https://www.wired.com/story/crt-tube-tv-hot-gaming-tech-retr...

To be fair, much of this is about color and shape rendering, since a lot of pixel art was tailored for CRTs.

Twitchy gamers do swear by “zero input lag”, but they're perhaps just nostalgic; the difference is likely to be 8ms vs. 10ms:

“Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag…”

https://www.resetera.com/threads/crts-have-8-3ms-of-input-la...


As you said, that article seemed to be more about the appearance of objects on a CRT than about lag, and I kind of agree with the nostalgia crowd in that respect. But [raster] CRT lag at 60Hz is always going to be 16ms in the worst case and will never get better, while LCDs can in theory run much faster as technology improves.

If we shift the discussion to vector CRTs (which have no pixels), such as the one the old Tempest [0] game used, the CRT has a major advantage over an LCD, and the lag can in principle be whatever the application programmer wants it to be. I miss vector games, and there's really no way to duplicate their "feel" with LCDs.

[0] https://en.m.wikipedia.org/wiki/Tempest_(video_game)


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen.

Back when I had CRTs, 60Hz displays were the older, less-common, cheapo option. I'm having a hard time remembering a CRT display that wasn't at least 75Hz (I believe this was the VESA standard for the minimum to be flicker-free), but most of the monitors I used had refresh rates in the 80-90Hz range. I remember a beautiful higher-end CRT that had a refresh rate around 110Hz.

85Hz gives you a frame time of about 11.8ms, which doesn't sound much better, but is roughly a 30% improvement over 16.7ms.


Before multi-sync CRTs and SVGA, 60Hz was not the "cheapo" option.


I don't think you can get a display slower than a TV, and they do in fact update at ~60Hz (or 50Hz, depending on region). Of course you're probably only getting VGA, 240p, or less in terms of pixels.


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

This is exactly the same as LCDs though, no? LCDs also draw an entire frame at a time; they're not "random access", for lack of a better term. There's just typically no image processing going on with a CRT*, so there's no inherent latency beyond the speed of the electron gun and the speed of light.

*I'm aware there were some later "digital" CRT models that did analog->digital conversion, followed by some digital signal processing on the image, then digital->analog conversion to drive the gun.


I don't think that LCDs buffer anything. I've experienced screen tearing, which shouldn't happen with buffering. Most applications do implement some kind of vsync, which does introduce buffering and the related delays.

The best option is to use adaptive sync and get rid of vsync. But support for this technology is surprisingly immature; it mostly only works in games.


Screen tearing happens when the software swaps buffers on the GPU while the GPU is in the middle of reading out a buffer to send to the monitor. That tearing has nothing at all to do with whatever buffering is or isn't happening in the display itself, because the discontinuity is actually present in the data stream sent to the display.
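
A minimal sketch of that swap decision, using the Python glfw bindings (the glfw package here is an assumption; the same idea applies to any swap-chain API):

    # swap_interval(1) makes buffer swaps wait for vertical blank (no tearing,
    # but up to a frame of extra latency); swap_interval(0) swaps immediately
    # (lower latency, but the readout can land mid-swap and tear).
    import glfw

    assert glfw.init()
    window = glfw.create_window(640, 480, "swap demo", None, None)
    glfw.make_context_current(window)
    glfw.swap_interval(0)                 # 0 = may tear, 1 = vsync

    while not glfw.window_should_close(window):
        # ... render the frame here ...
        glfw.swap_buffers(window)
        glfw.poll_events()

    glfw.terminate()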


See also this talk by John Carmack about buffering problems in input and output stacks.

https://www.youtube.com/watch?v=lHLpKzUxjGk


Sounds broken? Return it.



