No—Windows takes a strictly worse approach. macOS [and Linux] give you “some [not to spec] monitors have very slightly blurry text”; but Windows gives you “some TVs just plain don’t work at all.”

Note how I didn’t say that the TVs are not-to-spec; that’s because the EDID spec allows TVs to just not report an EDID (TVs are low-margin, and that’s one fewer chip on the signal path), in which case the video source must assume some standard default modes (e.g. 1080p@60 YUV 4:2:2). Monitors, meanwhile, must report an EDID. Why?
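To make that fallback concrete, here’s a minimal Python sketch of the check a source might do before giving up on the sink’s EDID; the 8-byte header magic and the checksum rule are from the EDID spec, but the function name and the fallback-mode table are just mine for illustration:

    # Sketch: did the sink hand back a plausible 128-byte EDID base block?
    EDID_HEADER = bytes.fromhex("00ffffffffffff00")  # fixed 8-byte header magic

    def has_valid_edid(blob: bytes | None) -> bool:
        if not blob or len(blob) < 128:
            return False                 # e.g. a TV that reports no EDID at all
        base = blob[:128]
        if base[:8] != EDID_HEADER:
            return False                 # garbage on the DDC bus
        return sum(base) % 256 == 0      # all 128 bytes must sum to 0 mod 256

    # When this returns False, the source assumes standard default TV modes:
    TV_FALLBACK_MODE = {"width": 1920, "height": 1080,
                        "refresh_hz": 60, "pixel_format": "YUV 4:2:2"}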

Because all this only applies over HDMI; and HDMI is a protocol for talking to TVs first-and-foremost, which is why the EDID spec assumes YUV as the fallback.

Monitors generally have DisplayPort inputs; and if you plug Windows or macOS into a monitor over DisplayPort, it’ll always go RGB, because YUV isn’t a video mode the DisplayPort protocol can even carry.

So when a monitor has an HDMI input, it’s essentially doing “TV compatibility” over that input—and it needs to report its capabilities with an EDID, lest the video source think it’s a TV and send base TV spec signals to it.
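So the decision a source ends up making looks roughly like this (another hedged Python sketch; the connector strings and return values are made up, not any real driver API):

    def choose_signal(connector: str, have_edid: bool) -> str:
        if connector == "DisplayPort":
            # DisplayPort sink: always RGB; modes come from its EDID.
            return "RGB, modes from EDID"
        if connector == "HDMI":
            if have_edid:
                # A monitor doing "TV compatibility": it reported an EDID,
                # so the source uses whatever capabilities it advertises.
                return "modes and color format from the EDID"
            # Silent sink over HDMI: assume a TV, send base TV-spec signals.
            return "1080p@60 YUV 4:2:2 (TV fallback)"
        raise ValueError(f"unknown connector: {connector}")

The complaint in the next paragraph is that, for some TVs, Windows never takes that have_edid=False fallback branch at all.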

Which all makes the situation even more ridiculous: there are TVs that Windows can’t talk to over HDMI, despite the HDMI connection implying, and the EDID spec explicitly stating, that Windows should make the fallback/safety assumption that it’s talking to a TV.

Also keep in mind that this is a prisoner’s dilemma: the only reason that monitor manufacturers are able to “get away with” being nonconformant to the EDID spec is because of Windows’ leniency. Windows is the “defector” here. If Windows stopped allowing it, then nobody would be allowing it, and so the monitor manufacturers would have to shape up.

Thank you for the detailed explanation. Learned something new.
