
That explains why it was inferior to every one of the specialized standards in the plethora of fields it attempted to handle with mediocrity.


Tell me more about the inferiority of pure RGB goodness :)


I don't need to, the tech specs speak for themselves. Maybe try reading/comparing them.

This isn't reddit, take your try-hard trolling elsewhere.


But it's you who is trolling. The CRT gun is driven with RGB signals; every TV has to convert CVBS/S-Video/component input to RGB anyway. Allowing raw RGB input simply skips an unnecessary conversion.
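To make the "every TV converts to RGB anyway" point concrete, here is a minimal sketch of the chroma-to-RGB step a TV performs for non-RGB inputs, using the ITU-R BT.601 coefficients and assuming full-range 8-bit values for simplicity (the function name and ranges are illustrative, not from any particular TV's implementation):

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one BT.601 YCbCr sample (0-255, chroma centered at 128) to RGB.

    This is the decode step every TV applies to composite/S-Video/component
    signals; a raw RGB input skips it and drives the gun directly.
    """
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Neutral samples map straight through: mid gray stays mid gray.
print(ycbcr_to_rgb(128, 128, 128))  # -> (128, 128, 128)
```

With composite video the chroma additionally has to be demodulated off a subcarrier before this step, which is where dot crawl and cross-talk creep in; RGB avoids both stages.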


You are so right. There is a reason why PC and arcade monitors work in RGB.

But in addition to that, there is no cross-talk, no dot crawl, and all the other weird artifacts of composite video. Also, no chroma modulation means that it was much easier for TVs to support both 50 and 60 Hz. I only ever had a GameCube, but being able to play in RGB at 60 Hz was amazing.

At least in Europe, many old consoles* provided native RGBS output, and with the proper SCART cable the video quality was awesome. Nowadays, people in the retro gaming community go crazy with PVM/BVM and other fancy professional monitors with RGB input, but the French already had it figured out in the '70s!

* anything from Sega; Nintendo from the SNES to the Wii U; the Neo Geo; Sony from the PSX to the PS3.


That has zero effect on how that data gets to said gun. Sending it at a maximum of 15 kHz/480i is intrinsically lower fidelity than up to 31 kHz/480p.

In other words: you can send me literal ASCII characters at 1 ch/s; 60 ch/s via telegraphy would still be superior despite the necessary conversion.

So, congrats on a hobbled standard that got ten extra years of non-use and that you had zero hand in developing.
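The 15 kHz vs 31 kHz figures in the comment above fall out of standard NTSC-style line timing. A back-of-the-envelope sketch (525 total lines per frame and the ~59.94 Hz field rate are the usual numbers; this is illustrative arithmetic, not a spec excerpt):

```python
# Horizontal scan rate = lines drawn per second.
FIELD_RATE = 60000 / 1001        # ~59.94 Hz NTSC field rate
TOTAL_LINES_PER_FRAME = 525

# 480i: each ~59.94 Hz field carries only half the frame (262.5 lines).
h_rate_480i = FIELD_RATE * TOTAL_LINES_PER_FRAME / 2   # ~15.73 kHz

# 480p: all 525 lines are drawn every field period.
h_rate_480p = FIELD_RATE * TOTAL_LINES_PER_FRAME       # ~31.47 kHz

print(f"480i: {h_rate_480i/1000:.2f} kHz, 480p: {h_rate_480p/1000:.2f} kHz")
```

So a progressive 480p signal needs roughly double the horizontal scan rate, which is why standard 15 kHz consumer CRTs (SCART TVs included) top out at 480i/576i.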


Why so angry? Thanks to the French, everyone and their mother in Europe had a TV with an RGB (not to mention S-Video) input, giving a perfect, non-pixel-crawly, consistently colored picture from VCRs, cable/satellite set-top boxes, fourth-generation-and-up game consoles, and finally 16-bit home computers.

Meanwhile, in the US, video inputs were somehow used for market segmentation, with consumers forced to use RF modulators :o and later CVBS, with only a brief couple of years of component input availability on HDTVs around 2000.


SCART could carry 480p, but few devices supported it.

In any case, this was only an issue in the 6th generation of consoles (Dreamcast, PS2, GameCube, Xbox). Before that era, everything was 15 kHz, and after it, everything started to include HDMI. The 6th generation was caught in between.



