I mean, enough people did for them to continue selling the S-Video cable for the N64 and GameCube.
These strong opinions break down when you consider that the PS3's default cable was composite, or that fighting games didn't come with an arcade stick.
Just because an accessory isn’t included doesn’t mean it was never intended to be used. That’s ridiculous.
Relax, buddy, I'm not talking about the PS3 here, and it isn't the 2014 era either; it's 1991 and the SNES. There is no internet, no information bombarding you from every direction, no downloading PDFs, no cheap arcade sticks from online shops.
“Intended” translates to what devs and the market assume 95% of users will use. The other word you're all looking for is “optimal”.
Claude Sonnet gave me a pretty interesting chart about analog and early digital filtering working in gamma space versus modern video processing today. Processing in gamma space results in both desaturation and overblown colors (there's a quick numeric sketch of the difference after the table).
| Composite aspect | Gamma-space processing | Linear-space processing |
|---|---|---|
| Bright saturation | Oversaturated, blown-out | More controlled, linear clipping |
| Dark saturation | Desaturated, muddy | Better preserved, but darker overall |
| Dot crawl | Prominent, “rainbow” edges | Reduced, but still present due to frequency overlap |
| Luma/chroma separation | Poor (nonlinear cross-talk) | Better (linear superposition holds) |
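To make the gamma vs. linear point concrete, here's a minimal sketch in Python, assuming a simple 2.2 power law as a stand-in for the CRT/sRGB transfer curve (the function names and values are mine, purely for illustration):

```python
# Minimal sketch: averaging two pixel values in gamma space vs. linear space.
# Assumes a plain 2.2 power-law gamma as an approximation of the CRT response.

GAMMA = 2.2

def to_linear(v):
    """Gamma-encoded value in [0, 1] -> linear light."""
    return v ** GAMMA

def to_gamma(v):
    """Linear light in [0, 1] -> gamma-encoded value."""
    return v ** (1.0 / GAMMA)

def avg_gamma_space(a, b):
    """Average the encoded values directly (what analog/early digital filtering does)."""
    return (a + b) / 2.0

def avg_linear_space(a, b):
    """Convert to linear light, average, convert back."""
    return to_gamma((to_linear(a) + to_linear(b)) / 2.0)

# A 50/50 mix of full white (1.0) and black (0.0), e.g. a checkerboard dither:
print(avg_gamma_space(1.0, 0.0))   # 0.5   -> only ~22% of the light, so it reads too dark
print(avg_linear_space(1.0, 0.0))  # ~0.73 -> the encoded value for true half intensity
```

Averaging encoded values pulls mixtures toward black, which is one reason gamma-space blending muddies dark areas while bright areas end up clipping harder.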
Composite was optimal for a whole host of systems. NES, PCE, Sega, Super NES, Sega again, PSX, N64. Arguably even Dreamcast.
Dithering being blended, free AA, MvC2 being less incongruent…composite was sharp and saturated on brand new CRTs that were being cranked out of factories left and right. Center even. Stage right. Stage left even!
Who wants the checkerboard overlay of S-Video when you can have the clarity and pop of composite? Once the PS2/Xbox/GameCube arrive, it's component all day, especially given the rise of progressive-scan CRTs. But on interlaced sets, it's not like you're kneecapping yourself by using composite either.
Comb filters and various other improvements froze dot crawl in place and prevented flickering artifacts. Composite felt like a huge upgrade over RF switches and spade lugs.
The Genesis 2 and 32X even removed the rainbows from the waterfalls.
In fact, in the PS2 era many if not most games look worse in progressive than in interlaced, since developers optimized for interlaced because that's what the vast majority of players had. Of course, I'm talking about the original hardware; it's different with emulators. And on modern monitors, you'll often get a better picture with a progressive signal, because you bypass the deinterlacer.
Just remembered the PS2 still used heavy dithering in almost every game.
Xbox and GameCube were just about dither-free though, IIRC.
Consoles started using a new type of dithering, temporal dithering (which relies on how interlacing works on a CRT), once they started doing 480i. Already in the PS1 era, the BIOS and some PS1 games like Tekken 3 used temporal dithering.
Don't know about the Xbox, but I think the GameCube also uses some dithering (I think it's lighter than the PS2's).
Composite was the choice for a number of reasons:
- 98% of TVs support it, except some very old/junk ones that support RF only
- Dithering blends while the image still stays sharp as a knife (except for Sega's junk encoders)
- Pretty good quality on a decent TV
- Massive upgrade compared to RF
Even the Saturn used fake dither transparencies, relying on composite to solve the problem.
The GameCube has dithering here and there, and that vertical stripe-like pattern is most likely temporal dithering.
From what I saw after a Google search, the Xbox has dithering too.
In fact, dithering (spatial or temporal) is still used even in today's PC games, thanks to UE5 and TAA (and its sisters like DLSS) “optimization”.
I don't remember that dithering in WW at all. Hrm. Progressive Scan mode enabled? WW was an absolute chore to play, to be fair; I've probably suppressed a lot of that boredom trauma.
This is what Google told me: “the ‘dithering’ in original Xbox models is an inherent flicker filter in 480i mode, which can be bypassed by using 480p resolution with the appropriate cables and settings.”
Interesting, 'cause I have no experience playing the OG Xbox without component cables, as I had my HD CRT by then. (Mostly the same with the GameCube, with some Rogue Leader exceptions over composite.)
Meanwhile, the PS2 over component had legit dithering in just about any game I played. Ecco the Dolphin (the 3D one) is a low-hanging-fruit example.
Maybe; also, that would most likely prove my point about …
And about temporal dithering, here's a PS1 example:
- the interlaced frame
- one field of it (a CRT television deals in fields, not frames)
- the other field of it
Notice how the vertical stripes change position in each field; when the fields are displayed in quick succession on an interlaced CRT with phosphor persistence, you get a better color gradient.
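Here's a toy sketch of that idea; the pixel values are invented for illustration, not captured from real PS1 output:

```python
# Toy illustration of temporal dithering across two interlaced fields.

field_a = [0, 255, 0, 255, 0, 255]   # vertical stripes, phase 0
field_b = [255, 0, 255, 0, 255, 0]   # same stripes, shifted by one pixel in the next field

# Phosphor persistence plus the eye's own integration roughly average
# successive fields over time:
perceived = [(a + b) / 2 for a, b in zip(field_a, field_b)]
print(perceived)   # [127.5, 127.5, ...] -> a mid level neither field actually contains
```

(Strictly speaking the eye integrates light, i.e. linear values, which ties back to the gamma-space table above, but the averaging idea is the same.)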
https://www.reddit.com/r/originalxbox/comments/frl9l3/why_does_metal_gear_solid_2_use_dithering_for/ and as I said, even today's games still use some form of dithering, but most of the time people don't notice it.
I think the entire problem revolves around the use of the word “intended”
It simply doesn’t follow that devs intended for you to use composite because that’s what most people used/had.
The term you’re looking for is “default configuration”
If we drop the word “intended,” the entire debate dissolves.
How about this: Official accessories were intended to be used, but the degree to which the companies and their developers expected and catered to them varied.
Unofficial accessories and mods are not intended, but they can undeniably be highly beneficial to users. This was also the case back in the day. One of the first things I did with my PAL Saturn was get a region/50-60 Hz mod.
RGB modding a console is also obviously beneficial because it gives you another option if it allows you to retain the default connections. Otherwise, it becomes a question of trade-offs.
Even if we didn't, does everyone closely follow all the instructions given by manufacturers, for every single product? Have you never thought “fuck X, I wanna do Y”? Sometimes you adapt things to suit your tastes better. Other times you don't even have a choice and it all becomes moot. Finally, we need to consider whether an author even intended anything specific; who knows how many of them just used whatever screen was in front of them and went with a “lol whatever” attitude.
I believe this is probably far more common than many people assume.
Dithering still looks fine on S-Video and RGB, IMO, especially checkerboard dithering. VGA games and handheld games used dithering too, and those displays were sharper than TVs. Resolving dither patterns comes down to how far the screen is from the viewer, or how relatively small it is. The preferred viewing distance for SDTVs was found to be about six times the screen height. Today people generally watch their TVs from closer to three times the screen height or less, and people sit even closer to computer monitors, so we are all looking at this content at a proportionally larger size than we used to (which is fine, and it's nice that we can view screens at these large sizes without getting fatigued now).
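A quick back-of-the-envelope calculation of what that distance difference means for a single scanline; the 27-inch 4:3 set and 480 visible lines below are numbers I picked for the example, not anything from the thread:

```python
import math

# Angular size of one scanline at 6x vs. 3x the screen height.

screen_height_in = 27 * 3 / 5        # ~16.2" picture height for a 27" 4:3 tube
lines = 480                          # visible lines, roughly
line_height = screen_height_in / lines

for multiple in (6, 3):
    distance = multiple * screen_height_in
    arcmin = math.degrees(math.atan(line_height / distance)) * 60
    print(f"{multiple}x screen height: ~{arcmin:.2f} arcmin per line")

# At ~6x height each line subtends roughly 1.2 arcminutes, close to the eye's
# ~1 arcminute acuity limit, so single-pixel dither tends to fuse; at ~3x it is
# about twice that and the pattern is much easier to resolve.
```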
Sacrificing overall resolution for smoother dither effects isn't worth it IMO, especially for text-heavy games, which is why I became interested in using S-Video back when I was playing on real hardware. The Genesis is really an exception because of how much dithering it uses (owing to its really soft output). Even though there's an RGB output, the jailbars and the lack of an official RGB cable suggest RGB really wasn't an intended format for it.
This is something that is definitely frequently overlooked.
It applies to CRT shaders, too: best results are obtained when you sit about 2x the recommended distance for HDTV. I think this also relates to the mask problem we were talking about recently; mask effects are susceptible to falling apart at closer distances, and it happens before the LCD pixels rise to the level of conscious awareness.
That's a good point: there's a much stronger case for composite as the intended output on consoles that lacked official RGB or S-Video outputs/cables.
Which brings us to @md2mcb's point:
Have you never thought “fuck X, I wanna do Y”?
The Genesis composite is so bad that it’s the first system I would want to connect via RGB if I was still into the real hardware scene.
There are a lot of patterns in old PC games that people believe get smoothed out, but I think that's more an effect of the prominent scanlines, and I'd wager you see this in some console games as well.

Now, my multisync monitor has a rather low dot pitch for EGA (0.28 mm), but I have a hard time seeing how this would really blend on a common 0.39 mm EGA monitor (for comparison, TVs of that size commonly had a pitch of around 0.64 mm, and the CGA monitor specs I have seen commonly go up to 0.52 mm).
The patterns of the first image won’t disappear when displayed with a VGA-like mode on my monitor.
You see patterns for color mixing and whatnot also on a lot of RGB-based Japanese computers:

Don't forget the nature of analog video, which kind of acted like a bilinear resize or a horizontal Gaussian, while in emulators the output is often point-sampled (nearest neighbour). So back then, dithering even over RGB or VGA got some smoothing, and that's aside from the CRT's own smoothing.
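As a rough sketch of that difference (the 3-tap kernel below is an arbitrary stand-in for the horizontal low-pass behaviour of an analog signal chain, not a measured model):

```python
# Why analog horizontal bandwidth blends dithering while nearest-neighbour scaling does not.

row = [0, 255, 0, 255, 0, 255, 0, 255]   # a 1-pixel checkerboard dither row

def horizontal_lowpass(pixels, kernel=(0.25, 0.5, 0.25)):
    """Convolve one scanline with a small blur kernel (edges clamped)."""
    out = []
    for i in range(len(pixels)):
        left = pixels[max(i - 1, 0)]
        right = pixels[min(i + 1, len(pixels) - 1)]
        out.append(kernel[0] * left + kernel[1] * pixels[i] + kernel[2] * right)
    return out

def nearest_neighbour_2x(pixels):
    """Integer upscale: every pixel is duplicated, so the pattern stays fully intact."""
    return [p for p in pixels for _ in range(2)]

print(horizontal_lowpass(row))     # most values pulled toward ~128 -> the dither blends
print(nearest_neighbour_2x(row))   # hard 0/255 alternation survives -> dither stays visible
```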