CRTs used to be cheap because they were made in high volumes and had a large ecosystem of parts suppliers. If you were to make a CRT today, you'd need to fabricate a lot more parts yourself, and the low volume production would require charging very high prices. You'd also have to deal with more stringent environmental laws, as CRTs contain many toxins, including large amounts of lead.
It's much cheaper to emulate CRT effects so that they work with any display technology. Modern LCDs and OLEDs have fast enough response times that you can get most CRT effects (and omit the ones you dislike, such as refresh flicker). And you don't have to deal with a heavy, bulky display that can implode and send leaded glass everywhere.
Unfortunately, the flicker is essential for the excellent motion quality CRTs are renowned for. If the image on the screen stays constant while your eyes are moving, the image formed on your retina is blurred. Blurbusters has a good explanation:
CRT phosphors light up extremely brightly when the electron beam hits them, then exponentially decay. Non-phosphor-based display technologies can attempt to emulate this by strobing a backlight or lighting the pixel for only a fraction of the frame time, but none can match this exponential decay characteristic of a genuine phosphor. I'd argue that the phosphor decay is the most important aspect of the CRT look, more so than any static image quality artifacts.
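For intuition, here's a toy sketch of that decay curve. It's purely illustrative, not any real display's firmware, and the 1.5 ms time constant is an assumption; real phosphors (P22 and friends) vary:

```python
import math

# Toy model: a phosphor flashes to full brightness when the beam hits it,
# then decays exponentially. decay_ms is an assumed time constant.
def phosphor_brightness(t_since_hit_ms, decay_ms=1.5):
    return math.exp(-t_since_hit_ms / decay_ms)

# At 60 Hz the beam revisits a scan line every ~16.7 ms, so by then the
# phosphor has faded to essentially black -- hence the flicker.
for t in (0.0, 0.5, 2.0, 8.0, 16.7):
    print(f"{t:4.1f} ms after the beam: {phosphor_brightness(t):.5f}")
```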
There is such a thing as a laser-powered phosphor display, which uses moving mirrors to scan lasers over the phosphors instead of an electron beam, but AFAIK this is only available as modules intended for building large outdoor displays:
But why would the flicker be considered "excellent motion quality"?
In real life, there's no flicker. Motion blur is part of real life. Filmmakers use the 180-degree shutter rule as a default to intentionally capture the amount of motion blur that feels natural.
I can understand why the CRT would reduce the motion blur, in the same way that when I super-dim an LED lamp at night and wave my hand, I see a strobe effect instead of smooth motion, because the LED is actually flickering on and off.
But I don't understand why this would ever be desirable. I view it as a defect of dimmed LED lights at night, and I view it as an undesirable quality of CRTs. I don't understand why anyone would call that "excellent motion quality" as opposed to "undesirable strobe effect".
Or for another analogy, it's like how in war and action scenes in films they'll occasionally switch to a 90-degree shutter (or something less than 180) to reduce the motion blur to give a kind of hyper-real sensation. It's effective when used judiciously for a few shots, but you'd never want to watch a whole movie like that.
Sample-and-hold causes smearing when your eyes track an image that is moving across the screen. That doesn't happen in the real world: if you follow an object with your eyes it is seen sharply.
With strobing, moving objects still remain sharp when tracked.
You're correct, but sadly most games and movies are made with low frame rates. Even 120fps is low compared to what you need for truly realistic motion. Flicker is a workaround to mitigate this problem. The ideal solution would be 1000fps or higher on a sample-and-hold display.
> Flicker is a workaround to mitigate this problem.
Isn't motion blur the best workaround to mitigate this problem?
As long as we're dealing with low frame rates, the motion blur in movies looks entirely natural. The lack of motion blur in a flicker situation looks extremely unnatural.
Which is why a lot of 3D games intentionally try to simulate motion blur.
And even if you're emulating an old 2D game designed for CRTs, I don't see why you'd prefer flicker over sample-and-hold. The link you provided explains how sample-and-hold "causes the frame to be blurred across your retinas" -- but this seems entirely desirable to me, since that's what happens with real objects in normal light. We expect motion blur. Real objects don't strobe/flicker.
(I mean, I get that you might want flicker for historical CRT authenticity, but I don't see how it could be a desirable property of displays generally.)
>Isn't motion blur the best workaround to mitigate this problem?
Motion blur in real life reacts to eye movement. When you watch a smoothly moving object, your eye accurately tracks it ("smooth pursuit") so that the image of that object is stationary on your retina, eliminating motion blur. If there are multiple objects moving in different directions you can only track one of them. You can choose where you want the motion blur just by focusing your attention. If you bake the motion blur into the video you lose this ability.
I guess it just comes down to aesthetic preference then.
If there's motion blur on something I'm tracking in smooth pursuit, it doesn't seem particularly objectionable. (I guess I also wonder how accurate the eye's smooth pursuit is -- especially with fast objects in video games, surely it's only approximate and therefore always somewhat blurry anyway? And even if you're tracking an object's movement perfectly, it can still be blurry as the video game character's arms move, its legs shift, its torso rotates, etc.)
Whereas if there's a flicker/strobe effect, that feels far more objectionable to me.
At the end of the day, my eyes are used to motion blur so a little bit extra on an object my eye is tracking doesn't seem like a big deal -- it still feels natural. Whereas strobe/flicker seems like a huge deal -- extremely unnatural, jumpy and jittery.
You should be able to get close to emulating CRT beam scanout + phosphor decay given high enough refresh rates.
E.g. given a 30 Hz (60i) retro signal, a 480 Hz display has 16 full screen refreshes for each input frame, while a 960 Hz display has 32. 480 Hz already exists, and 960 Hz is expected by the end of the decade.
You essentially draw the frame over and over with progressive darkening of individual scan lines to emulate phosphor decay.
In practice, you'd want to emulate the full beam scanout and not even wait for full input frames in order to reduce input lag.
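A minimal sketch of that idea, assuming a 480 Hz panel fed 60 Hz content (8 output refreshes per input frame) and an image already in linear light; the decay constant and band arithmetic are illustrative, not taken from any real scaler:

```python
import numpy as np

def crt_subframe(frame_lin, subframe_idx, subframes_per_frame=8, decay=0.35):
    """Return one output refresh: the band of lines the virtual beam just
    crossed at full brightness, older lines dimmed by how long ago the
    beam passed them (geometric falloff approximating phosphor decay).
    frame_lin: HxWx3 array in linear light."""
    h = frame_lin.shape[0]
    band = h // subframes_per_frame                 # scan lines swept per output refresh
    rows = np.arange(h)
    beam_bottom = (subframe_idx + 1) * band
    age = ((beam_bottom - 1 - rows) % h) // band    # refreshes since the beam passed each line
    gain = (1.0 - decay) ** age
    return frame_lin * gain[:, None, None]

# One input frame becomes 8 successive output refreshes:
# for i in range(8): display(crt_subframe(frame, i))
```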
Mr. Blurbuster himself has been pitching this idea for a while, as part of the software stack needed once we have 960+ Hz displays to finally get CRT-level motion clarity. For example:
> E.g. given a 30 Hz (60i) retro signal, a 480 Hz display has 16 full screen refreshes for each input frame, while a 960 Hz display has 32. 480 Hz already exists, and 960 Hz is expected by the end of the decade.
Many retro signals are 240p60 rather than 480i60. Nearly everything before the Playstation era.
Is there actually a fundamental physical limit that keeps modern (O)LED displays from emulating that “flicker”, or is it merely that established display driver boards can't do it because it isn't a mainstream requirement? If it's the latter, wouldn't it still be much cheaper to make an FPGA-powered board that drives a modern panel to “simulate” the flicker (in quotes because it may not be simulating so much as simply not adding the artificial persistence) than to bootstrap a modern CRT supply chain?
The reason this is a difficult problem is that physically emulating the flicker requires emulating the beam and phosphor decay, which necessitates a far higher refresh rate than the input refresh rate. You'd need cutting-edge, extremely high refresh rate monitors. The best monitor I found runs at 500 Hz, but pushing the limits like that usually means concessions in other departments. Maybe you could do it with that one.
My LG has something like that, OLED Motion Pro. I believe it inserts blank frames, given the panel runs faster than the 24fps content. Medium is noticeably darker, but OLEDs have plenty of brightness for my viewing space and it makes slow pans look much nicer. High is even darker but adds noticeable flicker to my eyes.
But the refresh rate needs to match the frame rate to get the best motion quality. If you display the same frames multiple times you'll get ghost images trailing the motion. Lots of games are locked to lower frame rates, and there's barely any 72fps video.
Looking at the Dalibor Farný company and how hard it is for them to make new nixie tubes a sustainable business, I shudder to think how much more effort it would take to get new, high quality CRTs off the ground. It would be cool though. A good start might be bringing back tube rebuilding more widely.
I think it's one of these things that people like to talk about in the abstract, but how many people really want a big CRT taking up space in their home?
Modern OLED displays are superior in every way and CRT aesthetics can be replicated in software, so a more practical route would probably be to build some "pass-through" device that adds shadow mask, color bleed, and what-have-you. A lot cheaper than restarting the production of cathode-ray tubes.
I recently bought a big CRT to take up space in my home.
Yes, of course, "objectively" speaking, an OLED display is superior. It has much better blacks and just better colors with a much wider gamut in general. But there's just something about the way a CRT looks - the sharp contrast between bleeding colors and crisp subpixels, the shadows that all fade to gray, the refresh flicker, the small jumps the picture sometimes makes when the decoding circuit misses an HBLANK - that's hard to replicate just in software. I've tried a lot of those filters, and it just doesn't come out the same. And even if it did look as nice, it would never be as cool.
Retro gaming has to be retro. And to be honest, the CRT plays Netflix better as well. It doesn't make you binge, you see? Because it's a little bit awful, and the screen is too small, and you can't make out the subtitles if you sit more than two meters away from the screen, and you can't make out anything if you sit closer than that.
Does that mean we have to restart the production of cathode-ray tubes? Hopefully not. But you can't contain the relics of an era in a pass-through device from jlcpcb.
If the display is working and the input layout isn't changing, you shouldn't accept any jumps at all. If the sync signals are coming at the same rate, the display should remain steady. (Well - as steady as you get with a CRT.) If they don't: it's broken.
> Modern OLED displays are superior in every way and CRT aesthetics can be replicated in software, so a more practical route would probably be to build some "pass-through" device that adds shadow mask, color bleed, and what-have-you.
OLEDs are still behind on motion clarity, but getting close. We finally have 480 Hz OLEDs, and we seem to be on track for the 1000 Hz needed to match CRTs.
The Retrotink 4k also exists as a standalone box to emulate CRTs and is really great. The main problem is its HDMI 2.0 output, so you need to choose between 4k60 output with better resolution to emulate CRT masks/scan lines, or 1440p120 for better motion clarity.
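Rough numbers behind that tradeoff, assuming 8-bit RGB and ~20% blanking overhead (real CTA/CVT timings differ a bit):

```python
HDMI_2_0_GBPS = 14.4  # usable video data rate: 18 Gbps minus 8b/10b TMDS overhead

def video_gbps(w, h, fps, bits_per_pixel=24, blanking=1.2):
    return w * h * fps * bits_per_pixel * blanking / 1e9

for name, mode in {"4k60": (3840, 2160, 60),
                   "1440p120": (2560, 1440, 120),
                   "4k120": (3840, 2160, 120)}.items():
    rate = video_gbps(*mode)
    verdict = "fits" if rate <= HDMI_2_0_GBPS else "exceeds HDMI 2.0"
    print(f"{name:9s} ~{rate:4.1f} Gbps  {verdict}")
```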
Something like 4k500 or 4k1000 is likely needed to really replace CRTs completely.
Really hoping by the time 1000 Hz displays are common we do end up with some pass-through box that can fully emulate everything. Emulating full rolling CRT gun scan out should be possible at that refresh rate, which would be amazing.
1000Hz is enough to match CRT quality on a sample-and-hold display, but only when you're displaying 1000fps content. A great many games are limited to 60fps, which means you'll need to either interpolate motion, which adds latency and artifacts, or insert black frames (or better, black lines for a rolling scan, which avoids the latency penalty), which reduces brightness. Adding 16 black frames between every image frame is probably going to reduce brightness to unacceptable levels.
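Back-of-envelope version of that brightness hit, assuming a 1000 Hz panel, 60 fps content, and a single lit subframe per source frame (a real panel could light more than one, trading back some persistence; the 600-nit figure is just an example):

```python
panel_hz = 1000
content_fps = 60
subframes = panel_hz // content_fps      # ~16 panel refreshes per source frame
lit_subframes = 1                        # refreshes that actually show the image
duty_cycle = lit_subframes / subframes

print(f"{subframes} refreshes per frame, duty cycle {duty_cycle:.1%}")
print(f"a 600-nit panel would average about {600 * duty_cycle:.0f} nits")
```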
The brightest CRTs were those used in CRT projectors. These had the advantage of using three separate monochrome tubes, which meant the whole screen could be coated in phosphor without any gaps, and they were often liquid cooled.
Direct-view color CRTs topped out at about 300 nits, which is IMO plenty for non-HDR content.
For smooth and fast motion, yes. Although I don't have such fast displays for testing, you can simulate the effect of sample-and-hold blur by applying linear motion blur in a linear color space. A static image (e.g. the sample-and-hold frame) with moving eyeballs (as in smooth pursuit eye tracking) looks identical to a moving image with static eyeballs, and the linear motion blur effect gives a good approximation of that moving image.
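A minimal sketch of that simulation, assuming an 8-bit sRGB input and purely horizontal panning; the blur distance is however many pixels the image moves per displayed frame:

```python
import numpy as np

def srgb_to_linear(x):
    return np.where(x <= 0.04045, x / 12.92, ((x + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(x):
    return np.where(x <= 0.0031308, x * 12.92, 1.055 * x ** (1 / 2.4) - 0.055)

def sample_and_hold_blur(img_srgb_u8, px_per_frame):
    """Approximate the retinal smear from eye-tracking a held frame:
    a box blur along the motion direction, computed in linear light."""
    lin = srgb_to_linear(img_srgb_u8.astype(np.float64) / 255.0)
    taps = max(int(round(px_per_frame)), 1)
    acc = sum(np.roll(lin, s, axis=1) for s in range(taps)) / taps
    return (linear_to_srgb(acc) * 255).round().astype(np.uint8)
```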
You should probably watch one of the old films about how CRTs were made. It's not a simple process and basically would require setting up a whole factory to mass produce them.
Hobbyist-level production of monochrome TV tubes is possible, but a big effort. Some of the early television restorers have tried.[1] Color, though, is far more complicated. A monochrome CRT just has a phosphor coating inside the glass. A color tube has photo-etched patterns of dots aligned with a metal shadow mask.
CRT rebuilding, where the neck is cut off, a new electron gun installed, and the tube re-sealed and evacuated, used to be part of the TV repair industry. That can be done in a small-scale workshop.
There's a commercial business which still restores CRTs.[2] Most of their work is restoring CRTs for old military avionics systems. But there are a few Sony and Panasonic models for which they have parts and can do restoration.
A practical issue is likely shipping costs. There aren't many consumer products that would be more costly to move around; at the high end you'd be selling something as awkward to ship as a fridge.
I imagine one could target smaller CRTs, though.
I know there have been conversations here about simulating CRT subpixels on HiDPI displays. Some games used subpixel rendering to achieve better antialiasing. With HiDPI you at least have a chance of doing it well.
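One possible sketch, assuming an aperture-grille-style vertical RGB stripe layout and an integer HiDPI scale factor (a slot or shadow mask would need a 2D pattern, and the scale/dim values here are just illustrative):

```python
import numpy as np

def apply_rgb_stripe_mask(img, scale=6, dim=0.25):
    """Upscale img (HxWx3 floats in 0..1) by `scale`, then attenuate each
    output column so each third of a source pixel favours R, G, then B."""
    big = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)
    w = big.shape[1]
    stripe = (np.arange(w) * 3 // scale) % 3       # 0=R, 1=G, 2=B column
    mask = np.full((w, 3), dim)
    mask[np.arange(w), stripe] = 1.0
    return big * mask[None, :, :]
```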