Were composite video colors that bad, in practice?

I’m wondering if dull, dim, dirty colors are something we really should be pursuing in our composite video shader presets.

I’ve never actually seen a photo of a well-functioning CRT displaying composite video that had dull, dim, dirty colors. The colors are different from RGB, but not really dull or unsaturated. If anything, I’ve noticed composite video on a CRT tends to be a bit more saturated.

Here’s what I think is happening:

-The saturation loss is caused by chroma bleed, which is reduced or eliminated by notch or comb filtering

-What little bleed remains causes a slight saturation loss in the colors that are bleeding (see the sketch after this list)

-TV manufacturers often did “nostalgic” calibrations, which probably caused clipping and other “incorrect” things, but it looked good to the consumer (bright and vibrant)

-The consumer then did whatever they wanted with the knobs
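To make the first two bullets concrete, here's a minimal single-scanline round trip in Python/NumPy. The numbers are illustrative and not tied to any particular shader. A flat, saturated field survives quadrature demodulation at full saturation; the dip only shows up at a sharp color transition, i.e. exactly where the bleed is:

```python
import numpy as np

# Toy single-scanline NTSC round trip. All values are illustrative.
fs = 4 * 3.579545e6           # sample rate: 4x the color subcarrier
fsc = 3.579545e6              # NTSC color subcarrier, Hz
N = 1024
t = np.arange(N) / fs

# Constant luma; chroma flips phase mid-line (a sharp color transition)
Y = np.full(N, 0.5)
I = np.where(np.arange(N) < N // 2, 0.3, -0.3)
Q = np.full(N, 0.2)

# Quadrature-modulate chroma onto the subcarrier and sum with luma
w = 2 * np.pi * fsc * t
composite = Y + I * np.cos(w) + Q * np.sin(w)

# Demodulate: multiply by the carrier, then low-pass (boxcar over 4 cycles)
def lowpass(x, width=16):
    return np.convolve(x, np.ones(width) / width, mode="same")

I_rec = lowpass(2 * composite * np.cos(w))
Q_rec = lowpass(2 * composite * np.sin(w))

sat = np.hypot(I_rec, Q_rec)
print("flat-field saturation:", sat[N // 4], "(input:", np.hypot(0.3, 0.2), ")")
print("saturation at the transition:", sat[N // 2])
```

The flat region comes back at full saturation; only the samples straddling the transition lose chroma, because the filter is averaging opposing chroma phases there.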

In short, I think a generic “composite video look” really comes down to chroma bleed and artifacting. In guest-advanced-ntsc, we should pay attention to the “NTSC Chroma Scaling / Bleeding” parameters, along with “NTSC Artifacting” and “NTSC Fringing.”

For a more nostalgic look, a slight hue shift of red towards orange (or some other subtle shift) can help to mimic whatever weird things the manufacturer was doing.
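A global tint rotation in YIQ is probably the simplest way to fake that. A sketch, using the standard FCC matrices; which sign moves red toward orange depends on your axis convention, so treat that as an assumption to check:

```python
import numpy as np

# Standard FCC RGB <-> YIQ matrices
RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])
YIQ_TO_RGB = np.linalg.inv(RGB_TO_YIQ)

def tint(rgb, degrees):
    # Rotate the (I, Q) chroma vector by a fixed angle, like a TV tint knob
    y, i, q = RGB_TO_YIQ @ rgb
    a = np.radians(degrees)
    i, q = i * np.cos(a) - q * np.sin(a), i * np.sin(a) + q * np.cos(a)
    return YIQ_TO_RGB @ np.array([y, i, q])

print(tint(np.array([1.0, 0.0, 0.0]), 5.0))  # pure red, nudged a few degrees
```

Note this shifts every hue by the same angle, which matches what a tint control does but is only an approximation of whatever nonuniform decoding a given set actually did.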

I don’t think we really need to touch NTSC Color Saturation or NTSC Brightness, though.

I think the way chroma bleed currently works might be improved if we could get it to actually cause saturation loss where it's bleeding from, if that makes any sense. Edit: Actually, I think GTUv50 is doing this. guest-ntsc / ntsc-adaptive kind of do it, but only at a very low chroma bleed setting, and you don't have full control over YIQ, so it's not quite as accurate/realistic.
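Here's roughly what I mean, as a baseband sketch. The boxcar widths are made up for illustration and are not GTU's actual cutoffs. If I and Q get separate low-passes, saturation only drops where chroma is smearing across an edge, and flat fields are untouched:

```python
import numpy as np

# Baseband band-limiting of I/Q; filter widths are illustrative only.
N = 256
I = np.where(np.arange(N) < N // 2, 0.3, -0.3)  # color edge mid-line
Q = np.full(N, 0.2)

def lp(x, width):
    return np.convolve(x, np.ones(width) / width, mode="same")

I_b, Q_b = lp(I, 8), lp(Q, 24)  # Q filtered harder than I, as in broadcast NTSC
sat = np.hypot(I_b, Q_b)

print("flat-field saturation:", sat[N // 8])   # unchanged
print("saturation at the edge:", sat[N // 2])  # dips where I is bleeding
```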

Thoughts?


Saturation was indeed not an issue in my experience, as you can see in these comparison shots I made for you a few years ago: How crappy of a TV do you need for composite video artifacts?

Based on my recent experience trying to vibe-code an NTSC de/modulation pipeline, saturation loss is a tradeoff of the notch/comb filtering: it kills a chunk of the chroma to keep it from interfering with the luma.
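For what it's worth, an ideal comb doesn't have to pay that price. Toy sketch, assuming two adjacent lines carry identical color: NTSC's subcarrier phase flips 180° from line to line, so a sum/difference separates luma and chroma without attenuating either.

```python
import numpy as np

# Two-line comb separation, idealized: adjacent lines have the same color.
fs, fsc, N = 4 * 3.579545e6, 3.579545e6, 1024
t = np.arange(N) / fs
Y, I, Q = 0.5, 0.3, 0.2

def scanline(phase):
    w = 2 * np.pi * fsc * t + phase
    return Y + I * np.cos(w) + Q * np.sin(w)

line_a = scanline(0.0)
line_b = scanline(np.pi)          # next line: subcarrier inverted

luma   = (line_a + line_b) / 2    # chroma cancels exactly
chroma = (line_a - line_b) / 2    # luma cancels; full chroma amplitude kept

print("recovered luma:", luma[:4])
print("chroma amplitude:", np.sqrt(2) * chroma.std(), "vs input:", np.hypot(I, Q))
```

A notch, by contrast, throws away a shared band of both signals. Vertical color detail breaks the same-color assumption, of course, which is where adaptive combs come in.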

I don’t really have an answer for how real hardware manages to stay super-saturated even after filtering. Maybe there’s some internal chroma boost after the filter stage? Dunno. Maybe @PlainOldPants has some knowledge here.
