Were composite video colors that bad, in practice?

Dithering still looks fine on S-Video and RGB, IMO, especially checkerboard dithering. VGA games and handheld games used dithering too, and those displays were sharper than TVs. Whether you resolve a dither pattern comes down to how far the screen is from the viewer, or how relatively small it is. The preferred viewing distance for SDTVs was found to be about six times the screen height. Today people generally watch their TVs at around three times the screen height or less, and people sit even closer to computer monitors, so we are all viewing this content at a proportionally larger size than we used to (which is fine, and it’s nice that we can view screens at these large sizes without getting fatigued now).
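To put rough numbers on that (a back-of-the-envelope sketch assuming the six-times and three-times screen-height figures above and a 240-line picture):

```python
import math

# Angular height of one scanline of a 240-line picture, in arc-minutes,
# at a viewing distance measured in multiples of the screen height.
# Illustrative geometry only, not a vision-science model.
def arcmin_per_line(distance_in_heights, lines=240):
    line_height = 1.0 / lines  # one scanline, in screen heights
    angle_rad = math.atan(line_height / distance_in_heights)
    return math.degrees(angle_rad) * 60.0

at_6h = arcmin_per_line(6)  # ~2.4 arcmin at the classic SDTV distance
at_3h = arcmin_per_line(3)  # ~4.8 arcmin at today's closer distance
```

With visual acuity commonly quoted around one arc-minute, a scanline at 6H sits near the threshold of visibility, while at 3H it is about twice as prominent, which would explain why dither and mask detail that used to disappear now stands out.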

Sacrificing overall resolution for smoother dither effects is, IMO, not worth it, especially for text-heavy games, which is why I became interested in using S-Video back when I was playing on real hardware. The Genesis is really an exception because of how much dithering is used (a consequence of its really soft output). Even though there’s an RGB output, the jailbars and the lack of an official RGB cable suggest it really was not an intended format for it.

1 Like

This is something that is definitely frequently overlooked.

It applies to CRT shaders, too - best results are obtained when you sit at about 2x the recommended distance for HDTV. I think this also relates to the mask problem we were talking about recently: mask effects are susceptible to falling apart at closer distances, and it happens before the LCD pixels rise to the level of conscious awareness.

That’s a good point - there’s a much stronger case for composite as the intended output on consoles that lacked official RGB or S-Video outputs/cables.

Which brings us to @md2mcb’s point,

Did you never think, “fuck X, I wanna do Y”?

The Genesis composite is so bad that it’s the first system I would want to connect via RGB if I were still into the real-hardware scene.

1 Like

There are a lot of patterns in old PC games that people believe get smoothed out, but I think that’s more an effect of the prominent scanlines, and I’d wager you also see this in some console games.

loom-dos-the-emerald-city

Now, my multisync monitor has a rather fine dot pitch for EGA (0.28 mm), but I have a hard time believing this would really blend on a common 0.39 mm EGA monitor (for comparison, TVs at that size commonly had a pitch of 0.64 mm, and the CGA monitor specs I have seen commonly go up to 0.52 mm).

The patterns of the first image won’t disappear when displayed with a VGA-like mode on my monitor.

You also see patterns for color mixing and whatnot on a lot of RGB-based Japanese computers:

jesus-ii-pc-88

4 Likes

Don’t forget the analog nature of the video signal, which acted a bit like a bilinear resize or a horizontal Gaussian blur, whereas in emulators the output will often be point-sampled (nearest neighbour). So back then, dithering even over RGB or VGA got some smoothing, and that’s aside from the smoothing of the CRT itself.
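A tiny sketch of that idea (with my own illustrative numbers standing in for the analog chain): a horizontal Gaussian applied to a checkerboard-style row pulls the alternating levels toward their average, while nearest-neighbour sampling would leave them untouched:

```python
import math

# Small normalized 1-D Gaussian kernel, a crude stand-in for the
# bandwidth limiting of an analog video chain.
def gaussian_kernel(sigma, radius):
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(row, sigma=0.8, radius=2):
    k = gaussian_kernel(sigma, radius)
    n = len(row)
    # Clamp at the edges so the output has the same length as the input.
    return [sum(row[min(max(i + j, 0), n - 1)] * k[j + radius]
                for j in range(-radius, radius + 1))
            for i in range(n)]

checker = [0.2, 0.8] * 8       # alternating "dither" of two levels
smoothed = blur(checker)       # interior samples pull toward the 0.5 average
```

Nearest-neighbour output would just reproduce `checker` verbatim, which is why raw emulator pixels show the pattern so starkly.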

1 Like

This is a good observation. I noticed the same thing when doing some side-by-side comparisons with Silent Hill once: the dithering with raw pixels is awful, but just the addition of scanlines makes the dithering less noticeable. Obviously it’s still there, but the scanlines trick the brain, I guess.

1 Like

Many handheld games exploited the ghosting effect of old LCD screens for effects like transparency. Also, old LCD screens weren’t that sharp even on still images, and they were so small that you couldn’t perceive the dither patterns.

I think Jamirus is showing a picture from an actual monitor.

The bandwidth limit of analog monitors does lead to smooth color transitions, but that bandwidth is well above the point where adjacent pixels would completely blend. A VGA monitor is meant to resolve at least 640 horizontal pixels, and most games ran in 320-wide modes; not to mention we continued to use 320 modes well into the era of SVGA and better monitors, which became common sometime around the early 90s.
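Rough arithmetic behind that claim (nominal figures: 25.175 MHz is the standard VGA dot clock, and ~30 MHz is an assumed typical video bandwidth for a basic VGA monitor):

```python
PIXEL_CLOCK = 25.175e6     # Hz, standard VGA dot clock for 640-wide modes

t_640 = 1.0 / PIXEL_CLOCK  # ~39.7 ns spent on each pixel in 640 mode
t_320 = 2.0 / PIXEL_CLOCK  # 320-wide modes repeat each pixel: ~79.4 ns

# A monitor with B Hz of video bandwidth can swing between levels roughly
# once per half-cycle, i.e. every 1/(2B) seconds.
def fastest_transition(bandwidth_hz):
    return 1.0 / (2.0 * bandwidth_hz)

monitor = fastest_transition(30e6)  # ~16.7 ns for the assumed 30 MHz monitor
# monitor < t_640 < t_320: each pixel lasts longer than the monitor's
# fastest transition, so adjacent 320-mode pixels are nowhere near blending.
```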

1 Like

I don’t think so, but let’s wait for his answer to be sure :slight_smile:

Edit: OK, the second image is from an actual monitor.

Anyway, here’s a real retro PC with a real retro monitor: https://youtu.be/GGQv_kYXmt4?t=646 (the video is in Arabic, but I added it just for the sake of the zoomed image at timestamp 10:46)

Edit: same scene, https://youtu.be/QgRIXntFhww?t=504 (but I think this one is an emulator)

1 Like

It’s not about the bandwidth limit, it’s about digital vs. analog:

https://superuser.com/questions/338982/is-the-picture-quality-much-better-using-hdmi-versus-using-vga

https://www.reddit.com/r/pcgaming/comments/a11swr/serious_question_how_much_better_is_hdmi_over_vga/

If it were just a bandwidth limit, then 240p should be perfect over composite, which was made for 480i (I’m talking about horizontal resolution, i.e. bandwidth, of course; vertically, the two are more or less identical).
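The same back-of-the-envelope arithmetic supports this (nominal NTSC figures: ~4.2 MHz luma bandwidth, ~52.6 µs of active line): bandwidth alone could resolve more than 320 samples per line, so composite’s dither blending has to come from something else, chiefly the chroma subcarrier interacting with fine luma detail.

```python
LUMA_BANDWIDTH = 4.2e6   # Hz, nominal NTSC luma bandwidth
ACTIVE_LINE = 52.6e-6    # s, active (visible) portion of one scanline

# One cycle at the top luma frequency covers two alternating samples, so
# the number of distinguishable horizontal samples per line is roughly:
resolvable = 2 * LUMA_BANDWIDTH * ACTIVE_LINE  # ~442
# ...comfortably above the ~320 horizontal pixels of a typical 240p game.
```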

Yes. I don’t know why this still isn’t common knowledge in 2025. NTSC TVs used to decode RF, composite, and S-Video (but not component) with an approximate correction for the 1953 primaries and a nonstandard white balance, and this continued even into the 2000s. In the case of a nonstandard white balance, they would only fix red through green and let everything else get thrown off. Why does no one on the internet seem to know this?
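For anyone curious what such a correction would even look like, here is a sketch built only from the published chromaticities (NTSC 1953 primaries with Illuminant C white, SMPTE-C primaries with D65). It skips chromatic adaptation and the per-set tweaks real decoders had, so treat it as an illustration of the theory, not a recreation of any actual TV:

```python
# Build an RGB->XYZ matrix from primary and white-point chromaticities,
# then chain two of them into an RGB->RGB "decoded as 1953" correction.
def invert(m):
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    # Columns from chromaticities, scaled so RGB (1,1,1) hits the white point.
    cols = [(x / y, 1.0, (1 - x - y) / y) for x, y in (xy_r, xy_g, xy_b)]
    P = [[cols[j][i] for j in range(3)] for i in range(3)]
    W = (xy_w[0] / xy_w[1], 1.0, (1 - xy_w[0] - xy_w[1]) / xy_w[1])
    S = mat_vec(invert(P), W)
    return [[P[i][j] * S[j] for j in range(3)] for i in range(3)]

# Published chromaticities: NTSC 1953 (Illuminant C), SMPTE-C (D65).
NTSC53 = rgb_to_xyz_matrix((0.67, 0.33), (0.21, 0.71), (0.14, 0.08),
                           (0.310, 0.316))
SMPTEC = rgb_to_xyz_matrix((0.630, 0.340), (0.310, 0.595), (0.155, 0.070),
                           (0.3127, 0.3290))

# SMPTE-C RGB -> XYZ -> "displayed as if it were" NTSC 1953 RGB.
correction = mat_mul(invert(NTSC53), SMPTEC)
```

As the post two replies down says, you would still have to tune this empirically per set; the matrix only shows the direction the colors get pushed.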

Maybe PVMs were different, but I am sure about the consumer TVs.

I happen to have a 1989 RCA ColorTrak Remote (about 13 inches, probably the size of an old computer monitor) with a white balance at around 7500K-8000K, which I’m working on simulating exactly via wide-gamut HDR displays (without success yet). Let’s hope this can give similar results to that TV from the YouTube video.

2 Likes

I actually did that. When I had a PS2, I used it through component cables (composite was too blurry on my screen). Eventually I played those Genesis compilation games and was pretty impressed by the results: dithering was fully visible, but everything else was a major improvement. Ironically, the consensus back then was favorable towards the better cables; people would RGB-mod their own mothers if they could. Just another of countless cases of the internet being unable to decide what to think. If I’m allowed to be frank, it’s tiresome to see all that bandwagon-jumping. My opinion is still the same after more than 20 years: I like clearer graphics, and it’s not my fault that some artists chose to rely on a very particular technological shortcoming as the basis of their product; many others didn’t go gung-ho on dithering and showed moderation.

3 Likes

I think it’s fair to say specific developers intended composite for specific games. Wind Waker, which Jobima brought up, is a good example, because without a composite connection you don’t blend those dither patterns into fog.

The game applies a heavy blur filter over everything, which generally made me feel like my component cables were being wasted. I was always upset about that and would play with a GameShark code that disabled it, on principle.

If a game has transparency that requires dithering, composite is intended. If a game has no dithering, forget about it. I notice this to be the case for a lot of Super NES games: no dither patterns anywhere in certain titles.

For the NES/PCE/Sega (Genesis/CD/32X/Saturn)/PSX/N64/PS2, though, it’s hard for me to think of any examples that lack dithering. Dreamcast/GameCube and Xbox apparently vary.

If we want to muddy the discussion even more, we can consider the usage of dithering in computer graphics in general (before 16-bit “high color”). It wouldn’t blend, but it would still give the false impression of more colors (brain tricks, I suppose). And anecdotally, youngsters seem to like visible dithering in newer games with retro graphics, because it’s “aesthetic” (their words).

gradient dither
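That “false impression of more colors” trick is just thresholding against a repeating pattern. A minimal sketch of 4x4 Bayer ordered dithering over a horizontal gradient (illustrative, not any particular game’s routine):

```python
# Classic 4x4 Bayer threshold matrix (values 0..15).
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_gradient(width, height=4):
    """1-bit ordered dither of a left-to-right black->white gradient."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            level = x / (width - 1)                        # 0.0 .. 1.0
            threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16  # ~0.03 .. ~0.97
            row.append(1 if level > threshold else 0)
        out.append(row)
    return out

img = dither_gradient(32)
# The density of 1s rises from left to right, faking intermediate shades
# even though every cell is strictly black or white.
```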

3 Likes

Well there’s no accounting for bad taste.

1 Like

It seems that brand (and possibly others) cared about staying true to NTSC 1953, while maybe others like Sony didn’t. I’ve seen documents showing some early-60s RCA TVs still using the NTSC 1953 primaries, too.

That’s more or less what I see on my CRTs: the true temperature should be something like 7500K, and green is pushed down on a CRT compared to sRGB LCDs. If you do something like `color *= vec3(1.0, 0.85, 1.15)` or so, LCD colors come a lot closer to the CRT. Actually, red is a lot stronger too. But not on (some) Android devices: it seems the gamut there is larger, and the calculation above makes a mess.
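Expanding that one-liner into a runnable sketch (the gains are the eyeballed values from above, not measurements, and applying them in linear light rather than on the gamma-encoded values is my own assumption about where such a correction belongs):

```python
# sRGB transfer functions (standard piecewise definition).
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def crt_tint(rgb, gains=(1.0, 0.85, 1.15)):
    """Per-channel gains in linear light, clamped back to displayable sRGB."""
    lin = (srgb_to_linear(v) * g for v, g in zip(rgb, gains))
    return tuple(min(1.0, max(0.0, linear_to_srgb(v))) for v in lin)

tinted = crt_tint((0.5, 0.5, 0.5))  # mid grey drifts away from green, toward blue
```

On a wider-than-sRGB (e.g. some Android) display, these same gains land on different physical primaries, which would explain why the trick falls apart there.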

1 Like

It’s because there isn’t evidence that that’s what they are actually doing. Lots of people on the Internet know that TVs have been doing various non-standard things with the color, but there isn’t anything that tells us why they were doing it. Correcting for 1953 primaries is an interesting theory, but it’s really just that, a theory, and it’s not really accurate anyway. You still have to empirically tune the math to get it correct for an individual TV set.

2 Likes

Does RCA have a bit more to do with NTSC 1953 color? There’s a great number of documents about the NTSC system here:

https://www.earlytelevision.org/rca_color_system.html

2 Likes

In my understanding, Loom is an odd beast since it’s EGA. There are VGA ports on PC and FM Towns that smoothed things out, though. See here: https://youtu.be/H7rUc4XhTZs?si=MzcLjKynulKoHU8-&t=2244

It’s also another game where it was quite challenging to find a proper aspect ratio to play with, especially the FM Towns port, since its superior vertical resolution ended up squashing the graphics, especially in the scenes with the crystal ball (see 41:49 in the video). I also ended up using Rec. 2020 colors in the most unintended way, despite playing on an sRGB display, just because it made the skin color for that guy look more natural here:

I think checkerboard dithering can be pleasing - it’s good-looking noise, especially in the shots you posted - but most of the time I try to play games scaled to fit my screen vertically, and then, without a shader, the checkerboard ends up uneven and ugly to me. So I tend to apply the same treatment I would use for audio restoration, if I had to draw an analogy with sound: checkerboard dithering on a sprite is like a low bit-depth file that can be reinterpolated and denoised (with Jinc2 dedithering, for example); then a CRT shader, NTSC or blur, color correction, and noise play the role of compressors, exciters, reverb, EQ, and re-dithering. All in all, it seems I use shader chains to process video the same way I would use audio plugins to process audio: trimming, denoising, resampling, EQing, and then adding more complex distortion and noise for a display that would be too clinical for my taste if played raw.

There is a pretty close relationship between NTSC 1953 color TV and RCA, and that’s a damn good reason to keep your own standard relevant, lol.

I’m assuming that carried on as RCA’s “trademark” colors: they wanted to keep that playful, full-of-life color look even on later TV sets, up to the point of adopting the SMPTE-C standard, or maybe even a bit later.

3 Likes

Also, back then the dot pitch was not that good, which would blend things even more: https://www.youtube.com/watch?v=m79HxULt3O8