Were composite video colors that bad, in practice?

This is a good observation. I noticed the same thing when doing some side-by-sides with Silent Hill once: the dithering with raw pixels is awful, but just adding the scanlines makes the dithering less noticeable. Obviously it’s still there, but the scanlines trick the brain, I guess.

1 Like

Many handheld games exploited the ghosting effect of old LCD screens for effects like transparency. Old LCD screens also weren’t that sharp, even on still images, and they were so small that you couldn’t perceive the dither pattern.

I think Jamirus is showing a picture from an actual monitor.

The bandwidth limit of analog monitors does lead to a smooth color transition, but that bandwidth is well above the point where adjacent pixels would completely blend. A VGA monitor is meant to resolve at least 640 horizontal pixels, and most games were in 320 mode; not to mention we continued to use the 320 mode well into the era of SVGA+ monitors, which became common sometime around the early 90s.
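That claim can be sanity-checked with a back-of-envelope calculation. The figures below (≈4.2 MHz NTSC luma bandwidth, ≈52.6 µs active line, 25.175 MHz pixel clock for VGA 640×480@60) are nominal textbook values, not measurements of any particular monitor:

```python
# Back-of-envelope: how many alternating on/off pixels a given analog
# bandwidth can resolve across one active scanline (Nyquist: one cycle
# carries two pixels). All figures are nominal approximations.

def resolvable_pixels(bandwidth_hz, active_line_s):
    return 2 * bandwidth_hz * active_line_s

# NTSC composite: ~4.2 MHz luma bandwidth, ~52.6 us active line
composite = resolvable_pixels(4.2e6, 52.6e-6)

# VGA 640x480@60: 25.175 MHz pixel clock. Resolving alternating pixels
# needs video bandwidth of at least half the pixel clock, which monitor
# amplifiers (tens of MHz) comfortably exceed, especially in 320 mode.
vga_needed_640 = 25.175e6 / 2
vga_needed_320 = vga_needed_640 / 2

print(round(composite))       # ~442 pixels: composite blurs 640-wide detail
print(vga_needed_640 / 1e6)   # ~12.6 MHz needed for full 640 detail
print(vga_needed_320 / 1e6)   # ~6.3 MHz needed for 320-wide detail
```

So composite genuinely can’t separate 640 pixels per line, while even a modest VGA monitor has headroom, which is why the dithering stays visible there.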

1 Like

I don’t think so, but let’s wait for his answer to be sure :slight_smile:

Edit: OK, the second image is from an actual monitor

Anyway, here’s a real retro PC with a real retro monitor: https://youtu.be/GGQv_kYXmt4?t=646 (the video is in Arabic, but I added it just for the sake of the zoomed image at timestamp 10:46)

Edit: same scene https://youtu.be/QgRIXntFhww?t=504 (but I think it’s on an emulator)

1 Like

It’s not about a bandwidth limit; it’s about digital vs. analog:

https://superuser.com/questions/338982/is-the-picture-quality-much-better-using-hdmi-versus-using-vga

https://www.reddit.com/r/pcgaming/comments/a11swr/serious_question_how_much_better_is_hdmi_over_vga/

If it were just a bandwidth limit, then 240p should be perfect over composite, which was made for 480i (I’m talking about horizontal resolution/bandwidth, of course; vertically, the two are more or less identical).

Yes. I don’t know why this still isn’t common knowledge in 2025. NTSC TVs used to decode RF, composite, and S-Video (but not component) with an approximate correction for the 1953 primaries and a nonstandard white balance, and this continued even into the 2000s. In the case of a nonstandard white balance, they would only correct red through green and let everything else get thrown off. Why does no one on the internet seem to know this?

Maybe PVMs were different, but I am sure about the consumer TVs.

I happen to have a 1989 RCA ColorTrak Remote (about 13 inches, probably the size of an old computer monitor) with a white balance at around 7500K-8000K, which I’m working on simulating exactly on wide-gamut HDR displays (without success yet). Let’s hope this can give results similar to that TV from the YouTube video.
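For what it’s worth, a crude first approximation of such a white-point shift can be sketched in a few lines. This is just a naive von-Kries-style gain calculation in linear sRGB, using standard illuminant D75 (≈7500K, x=0.29902, y=0.31485) as a stand-in for the measured white; real calibration against a specific tube (and a proper Bradford adaptation) would obviously differ:

```python
# Naive sketch: per-channel gains that push an sRGB (D65) display's white
# toward ~7500K, using illuminant D75 as a stand-in. Not a calibration.

def xy_to_XYZ(x, y, Y=1.0):
    # CIE xy chromaticity to XYZ tristimulus, normalized to luminance Y
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

def xyz_to_linear_srgb(X, Y, Z):
    # sRGB (IEC 61966-2-1) XYZ -> linear RGB matrix, D65 white
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return (r, g, b)

target = xy_to_XYZ(0.29902, 0.31485)           # D75 white point
gains = xyz_to_linear_srgb(*target)            # linear-light channel gains
scaled = tuple(c / max(gains) for c in gains)  # keep all channels <= 1.0

print([round(c, 3) for c in gains])   # roughly [0.929, 1.007, 1.145]
print([round(c, 3) for c in scaled])
```

Note the result is in linear light and needs to be applied before the display gamma; it also does nothing about the red-through-green decoder behavior described above.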

2 Likes

I actually did that. When I had a PS2, I used it through component cables (composite was too blurry on my screen). Eventually, I played those Genesis compilation games and was pretty impressed by the results. The dithering was fully visible, though everything else was a major improvement. Ironically, the consensus back then favored the better cables; people would RGB-mod their own mothers if they could. Just another of countless cases of the internet being unable to decide what to think. If I’m allowed to be frank, it’s tiresome to see all that bandwagon-jumping. My opinion is still the same after more than 20 years: I like clearer graphics, and it’s not my fault that some artists chose to rely on a very particular technology shortcoming as the basis for their product; many others didn’t go gung-ho on dithering and showed moderation.

3 Likes

I think it’s fair to say specific developers intended composite for specific games. Wind Waker, which Jobima brought up, is a good example, because without the composite connection you don’t blend those dither patterns into fog.

The game applies a heavy blur filter over everything, which generally made me feel like my component cables were being wasted. I was always upset about that and would play with a GameShark code that disabled it, on principle.

If a game has transparency that requires dithering, composite is intended. If a game has no dithering, forget about it. I notice this to be the case for a lot of Super NES games: no dither patterns anywhere.

For NES/PCE/Sega (Genesis/CD/32X/Saturn)/PSX/N64/PS2, though, it’s hard for me to think of any examples that lack dithering. Dreamcast/GameCube and Xbox apparently vary.

If we want to muddy the discussion even more, we can consider the usage of dithering in computer graphics in general (before 16-bit “high color”). It wouldn’t blend, but it would still give the false impression of more colors (brain tricks, I suppose). And as a life anecdote, youngsters seem to like visible dithering in newer games with retro graphics, because it’s “aesthetic” (their words).

gradient dither
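A tiny sketch of that “false impression of more colors”: ordered (Bayer) dithering reduces a smooth gradient to pure black and white, yet local averages still track the original levels. This is a generic textbook example, not tied to any particular game:

```python
# Ordered (Bayer) dithering of a gradient down to two levels. Despite
# only black and white in the output, per-column averages still follow
# the original 0..255 ramp, which is the illusion dithering relies on.

BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_row(levels, y):
    # Threshold each 8-bit level against the tiled 4x4 Bayer matrix.
    out = []
    for x, v in enumerate(levels):
        threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0 * 255.0
        out.append(255 if v > threshold else 0)
    return out

gradient = [int(x * 255 / 63) for x in range(64)]
image = [dither_row(gradient, y) for y in range(16)]

# Column averages approximate the original gradient values:
col_avg = [sum(image[y][x] for y in range(16)) / 16 for x in range(64)]
print(col_avg[0], col_avg[32], col_avg[63])  # 0.0 127.5 255.0
```

On a blurry display (or with a blur shader) the eye does the averaging, which is exactly the composite-blending effect being discussed.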

3 Likes

Well there’s no accounting for bad taste.

1 Like

Seems that brand (and possibly others) cared about staying relevant to NTSC 1953, while others like Sony maybe didn’t. I’ve seen documents showing some early-60s RCA TVs still using NTSC 1953 primaries too.

That’s more or less what I see on my CRTs: the true temperature should be around 7500K, and green is pushed down on a CRT compared to sRGB LCDs. If you do something like

```glsl
color *= vec3(1.0, 0.85, 1.15);
```

or something, LCD colors come a lot closer to CRT. Actually, red is a lot stronger too. But not on (some) Android devices; the gamut there seems to be larger, and the calculation above makes a mess.

1 Like

It’s because there isn’t evidence that that’s what they were actually doing. Lots of people on the internet know that TVs have been doing various non-standard things with color, but there isn’t anything that tells us why. Correcting for 1953 primaries is an interesting theory, but it’s really just that, a theory, and it’s not really accurate anyway. You still have to empirically tune the math to get it correct for an individual TV set.

2 Likes

RCA does have a bit more to do with NTSC 1953 color.

A great number of documents about the NTSC color system:

https://www.earlytelevision.org/rca_color_system.html

2 Likes

In my understanding, Loom is an odd beast since it’s EGA. There’s a VGA port on PC and FM Towns that smoothed things out, though. See here: https://youtu.be/H7rUc4XhTZs?si=MzcLjKynulKoHU8-&t=2244

It was also another game where it was quite challenging to find a proper aspect ratio to play with, especially the FM Towns port, since the higher vertical resolution ended up squashing the graphics, especially in the scenes with the crystal ball (see 41:49 in the video). I also ended up using the Rec. 2020 colors in the most unintended way, while playing on an sRGB display, just because it made the skin color of that guy look more natural here:

I think the checkerboard dithering can be pleasing; it’s a good-looking noise, especially in the shots you posted. But most of the time I try to play games scaled to fit my screen vertically, and then, without a shader, the checkerboard ends up uneven and ugly for me. So I tend to adopt the same treatment I would use for audio restoration, if I had to make an analogy with sound: checkerboard dithering on a sprite is similar to a low bit-depth file that can be reinterpolated and denoised (with Jinc2 dedithering, for example); then CRT shader, NTSC or blur, color correction, and noise are used like compressors and exciters, reverb, EQ, and re-dithering. All in all, it seems I use shader chains to process video the same way I would use audio plugins to process audio: trimming, denoising, resampling, EQing, and then adding more complex distortion and noise for a display that would be too clinical for my taste if played raw.
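To make the dedithering half of the analogy concrete, here is a deliberately naive 1D sketch of the idea (nothing like Jinc2, whose actual kernel is quite different): detect the checkerboard signature (a pixel whose two horizontal neighbors agree with each other but disagree with it) and blend it toward the flat value the dither was approximating:

```python
# Naive 1D dedither: where a pixel's horizontal neighbors agree with each
# other but disagree with the pixel itself (the checkerboard signature),
# collapse it toward the midpoint, recovering the intended flat tone.

def dedither_row(row):
    out = list(row)
    for x in range(1, len(row) - 1):
        left, center, right = row[x - 1], row[x], row[x + 1]
        if left == right and left != center:
            out[x] = (left + center) // 2  # blend toward the midpoint
    return out

# A 50% checkerboard of 0 and 200 "means" a flat 100:
dithered = [0, 200] * 5
print(dedither_row(dithered))
```

Real dedither shaders work in 2D and with soft thresholds, but the principle (undoing the bit-depth trick before re-adding controlled noise) is the same as the audio workflow described above.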

There is a pretty close relationship between NTSC 1953 color TV and RCA; now that’s a damn good reason to keep your own standard relevant lol

I am assuming that went down the road as RCA “trademark” colors, so they wanted to keep that playful, full-of-life color look even on later TV sets, up to the point of adopting the SMPTE-C standard, or maybe even a bit later.

3 Likes

Also, back then the dot pitch was not that good, so that blends things even more: https://www.youtube.com/watch?v=m79HxULt3O8

I think the main reason is that they haven’t experienced the older games in their original environments (and this is one of the drawbacks of emulators). And unfortunately, there are even old people who prefer a raster-sharp image and consider sharpness the most important criterion for quality.

That’s basically a scam monitor, because the dot pitch is so high for its size that it can’t resolve the EGA/VGA standard resolutions. High-pitch monitors of comparable size usually had a 0.39 mm (dot) or 0.41 mm (slot) pitch.

The original CRT that IBM made to go with EGA (the IBM 5154) in 1984 already had a pitch of 0.31 mm.

There are a bunch of pics around from an ’80s PC-98 monitor with a 0.39 mm pitch. The PC-98 outputs at 640×400.

Source image:

PC-KD854N-PC-98_6_Original

It doesn’t look that much different on my 0.28mm monitor though, patterns just get lost in tiny scanlines.
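As a rough rule of thumb, you can compare millimeters-per-pixel against the dot pitch. The 14-inch size and the 92% usable-area factor below are assumptions for illustration, not specs of the monitor in the video:

```python
# Rough check: can a given dot pitch resolve a given horizontal
# resolution? Approximation: usable screen width divided by the pixel
# count should be at least the pitch (triads and beam spot size make
# reality fuzzier than this simple ratio).

def mm_per_pixel(diagonal_inch, h_pixels, aspect=(4, 3), usable=0.92):
    # 'usable' is a hypothetical usable-area factor; faceplates vary.
    w, h = aspect
    width_mm = diagonal_inch * 25.4 * w / (w * w + h * h) ** 0.5 * usable
    return width_mm / h_pixels

# A 14" 4:3 monitor at 640 horizontal pixels:
px = mm_per_pixel(14, 640)
print(round(px, 3))  # ~0.41 mm available per pixel

# 0.39 mm pitch sits just under that: tight but workable (the PC-98
# case), while a coarser pitch can't cleanly separate 640-wide detail.
print(px >= 0.39)
```

By the same math, a 320-wide mode gets roughly twice the room per pixel, which is why even coarse tubes look fine there.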

Back to colors, though: the differences for this game on Chris McCovell’s RGB vs. composite page always seemed a little drastic to me. Capture-specific, or within range on TVs?

MagicalHat_RGB

MagicalHat-Composite

The game pictured is Magical Hat no Buttobi Tabo! Daibouken. From what I could find, the anime intro looks more like the top, and this looks like it could be from real hardware:

https://www.youtube.com/watch?v=9qKk62aqmAw

2 Likes

Don’t these captures skip the TV’s decoding/color correction?

I know; anyway, the point is that back then there was no raster-sharp image like how we see those games in emulators today, since they used VGA, which is analog and has some smoothing/blur (it’s not as strong as composite’s, of course; in fact, it’s the least among them, but it exists).

Also, even if it was a scam monitor, it was one of the options and not an anomaly. According to the video, it was common practice to offer cheap options, and most people go with the cheap price.