I think manufacturers gave their worst when sending TVs to my country. Up until the 00s, I only knew two kinds of screens: “crappy” and “eh-whatever”. I suppose richer people had access to better stuff, but it’s still a stark contrast to see such a good picture on a screen from the 80s.
One reason LCD TVs took the market by storm was that they were less tiring to watch and had sharper, clearer colors.
It is mostly among retro video game and TV enthusiasts that old TVs are praised. In the case of analog film or vinyl records, for example, one can make an argument about quality, and that argument was still prevalent during the CD and DVD era. As for TV, it was impossible to get hold of Betacam tapes, so one had to be content with low-quality VHS tapes. I did not praise VHS quality even back then when compared to broadcast quality.
But for retro consoles, that appreciation emerged a few decades later. It mostly had to do with the fact that, in contrast to film, the video game industry imploded after the PS2 era and people started shifting to retro games.
I remember seeing some higher-grade VHS tapes, as a couple of acquaintances of mine were AV nerds and liked to tape everything in pristine quality. I’m not privy to the details, but the quality was indeed much better. Regardless, it was not the norm. The VHS tapes I had, and the ones I saw for sale, were the very common kind, and their sole purpose seemed to be recording the average broadcast you would otherwise miss. And that’s how I used them, heh.
Anyway, what you wrote is what I also see around me. That’s my experience too. Whenever I go to a richer person’s house and see those modern and expensive TVs, I’m always impressed at how good they are. CRTs are cool, but they’re niche, no doubt about it; beyond retrogaming, I fail to see their use.
Let’s do some quick math, starting not from the date those devices were created, but from when they were popularized:
- CRT TVs: became mainstream in the mid-50s, started to be phased out in the mid-00s.
- Modern TVs: became mainstream in the mid-00s, they’re still going on.
In other words, CRTs had around 50 years of development; modern screens have had less than half of that, and already mostly improve on the previous technology. I wish shader development also had room to make the best of both worlds: instead of focusing on forcing a modern screen to behave like an old one, it could take the most important traits the previous tech had and adapt them (not transpose them 1:1) to our current technology.
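To give a purely hypothetical example of what I mean by “adapt, not transpose”: rather than simulating the whole composite pipeline, a shader could keep only the trait people actually miss, like dither blending, and apply it selectively. A minimal Python/NumPy sketch of that idea, with made-up names and a made-up threshold:

```python
import numpy as np

def blend_dither(frame: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """frame: float RGB image, shape (H, W, 3), values in 0..1.
    Blend only pixels that look like column-alternating dithering,
    leaving everything else as sharp as the modern display allows."""
    left  = np.roll(frame,  1, axis=1)   # pixel to the left (wraps at edges; fine for a sketch)
    right = np.roll(frame, -1, axis=1)   # pixel to the right
    # Dither heuristic: the two neighbours agree with each other
    # but disagree with the centre pixel (classic alternating columns).
    alternating = np.abs(left - right).mean(axis=2, keepdims=True) < threshold
    contrast    = np.abs(frame - left).mean(axis=2, keepdims=True) > threshold
    mask = (alternating & contrast).astype(frame.dtype)
    blended = (left + frame + right) / 3.0   # simple 3-tap horizontal average
    return frame * (1.0 - mask) + blended * mask
```

The point isn’t this exact filter, just that the old tech’s useful side effects could be reproduced on their own terms instead of dragging the whole signal chain along.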
Probably the cable that came with the machine is the “intended” one. Even if it supports a 10-button gamepad, if it came with a 2-button one, then that’s the “intended” one. The default factory configuration is what 90% of users and software devs will use in the end.
This is Nintendo Power magazine. Nintendo is dedicating a full page in the official Nintendo magazine to telling people to use S-Video.
Why include S-Video on the console at all? It’s more costly to add it; it would be cheaper to just leave it out. Heck, the SNES natively supports RGB. All you need is a cable.
Nintendo did remove RGB from later consoles; it’s a cost-saving measure. Same with the GameCube: the first models had component output, then they removed it because of how expensive it was. So Nintendo including S-Video is a very deliberate choice; they thought it was worth it.
I think its very existence is proof enough that it was intended to be used. It seems so strange to say “S-Video is there, and even RGB, but you’re not supposed to use those.”
In the PS1 manual (at least for the SCPH-7000), Sony also said that for better quality you should get an S-Video cable if your TV has the input.
But how many people had S-Video on their TV, let alone bothered to read the manual and buy an additional cable? The irony is that there is an official RF adapter, which works even on the PS3.
So at least for the PS1 and everything before it, I think the main target was composite. Targeting S-Video makes sense in the PS2 era, since I’ve noticed they exploit temporal blending of interlaced fields (whether through the eye or phosphor persistence) instead of relying on composite characteristics like in the PS1 era and before.
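To illustrate what I mean by temporal blending (this is just my toy model, not anything pulled from an actual game): if an effect is drawn only on every other field or frame, the eye or the phosphor persistence roughly integrates the two into their average, which is how flicker-based pseudo-transparency works. A sketch in Python with assumed names:

```python
import numpy as np

def perceived(plain: np.ndarray, with_overlay: np.ndarray) -> np.ndarray:
    """plain: a frame without the effect; with_overlay: the same frame with
    the effect drawn opaquely. Alternating them every field/frame at 50/60 Hz
    is roughly perceived as their average, i.e. ~50% transparency."""
    return (plain + with_overlay) / 2.0

# Toy usage: a black background alternated with an opaque white overlay reads as mid grey.
bg = np.zeros((240, 320, 3))
fx = np.ones((240, 320, 3))
print(perceived(bg, fx).mean())   # -> 0.5
```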
Enthusiasts at the time would have used the best connection available. So which is correct?
The average, “nostalgic” experience, or
The enthusiast’s choice?
The answer is: both. Take your pick 
Not all TVs had S-Video, while 98% had composite. That’s the reason. Even the C64 had S-Video; it was an option, and if you had a capable monitor you would use it. I’m not sure it increases cost: the encoder simply outputs RF, composite, S-Video, and RGB. It’s already an ability of the chip, and those chips were probably quite cheap by then.
It does increase cost, but not through the chips. It’s the cables. Not many people were willing to shell out $100 for a cable.
Nintendo would buy the chip that does all the signals for something like $0.40 per unit, while a junk encoder would cost $0.20. That difference is no serious cost. But if they had shipped the consoles with an S-Video cable, 50% of users would have sent them back because they couldn’t plug it into anything.
Manufacturing the cables is what got expensive, and there weren’t enough people willing to buy them to justify the cost to Nintendo, I guess.
Interesting that Sega and Sony just kept adding S-Video and RGB to everything, despite this. And S-Video remained an option on all Nintendo consoles after the NES, despite 98% of people not knowing what it was. Odd choice if they didn’t “intend” for you to use it. Also odd that they’d take out a full page telling people to use it and describing how to use it in detail…
That ad is a noob’s guide, assuming the reader is still using RF: it says take that out and use the AV cable, plug in the yellow video plug, the stereo sound, etc., like instructions for a baby. Then it states that IF you have a TV capable of S-Video, use it for better quality. Which was pretty obvious to anyone with even slight experience with video games.
Of course all available connections are intended to be used (or else why would they be there?), as per the user’s needs and the TV’s capabilities. In reality, most people would just be happy with AV and keep using it. There is an even better option than S-Video there too, RGB, which wouldn’t be used for the same reason: not all TVs supported it. Perhaps with a custom SNES-to-1084S cable or something. Nintendo cables/connectors were almost always custom, so you usually had to pay a lot for them. E.g. there wasn’t a simple AV plug like most other systems have.
I think these debates about what was intended are a bit of a red herring, at least when you look at it system- or even generation-wise and outside of strictly replicating certain time-accurate experiences.
E.g. another way of saying that developers drew graphics with composite in mind is that they pandered to the lowest common denominator. This is usually considered a bad thing.
Of course in this context it’s not that easy, because you can get some nice effects that break with a better connection. Or you may just generally prefer a less sharp look because you’re using a high-TVL monitor, etc., whatever.
But
A) That doesn’t negate the clarity you get with the better signals
B) There are thousands of games for the mainstream, somewhat successful systems, exploiting RF/composite to varying degrees. Sometimes not at all, sometimes a lot. Although that’s very relative. Like, how much of Sonic is actually showing effects like the waterfalls and the rainbows in the tubes?
Nostalgia aside, I often wouldn’t consider it very integral to the game experience if that was all there was to it. The most obvious clear-cut case for composite, to me, is when it’s the difference between a monochrome picture and a picture with color, and that’s afaik only found in the computer world.
Which shows that they’re trying to spread awareness of superior video connections and encourage their use…?
It’s pretty obvious they just want to show that their system can put out a better signal if your set can handle it. I don’t understand why we should debate it. If it was the intended one, they would have shipped the console with that cable instead. It can also do RGB; why didn’t they ship it with that cable?
They didn’t just tell people to use S-Video. They were making users aware of the superior-quality video and audio output options available as an alternative to RF, which many might have been using by default because that’s what they knew about.
Remember, during the ’70s and ’80s composite video inputs were not yet common on TVs, so people were “trained” to use an RF switch.
During the ’90s, most composite inputs on sets went unused, even for cable TV boxes, because of a lack of knowledge among the wider public.
The same thing happened in the late ’90s when S-Video started showing up on more high-end sets.
RGB (via VGA PC Input or Component Input) wasn’t a thing on consumer TVs until almost the EDTV/HDTV era.
I see the poster as more suggestive and informative as opposed to saying what was explicitly intended.
The Multi-Out cables were just a way that users could have gotten a much better-quality experience than RF, provided that they knew and understood what they were for and had a TV from the current era and not some old relic of the past.
The S-Video Multi-Out cable was a separate purchase as well, so they were also marketing a more premium experience, and they could make some more money from users with premium TVs and the extra cash and means to get their hands on one of those S-Video cables.
Composite is underestimated. I have an A520 modulator with an Amiga 500 Plus (it has RGB, but I tested composite for the sake of it) and it looks absolutely brilliant on a CRT. I’d even prefer it to RGB if I had to choose. All the dithering is blended, and it’s crazy sharp too.
If the system had a decent encoder (looking at you, Sega Genesis) it could produce a great image.
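For anyone curious why the dithering blends, here’s a tiny back-of-the-envelope illustration (a crude stand-in, not a real NTSC/PAL model): composite’s limited bandwidth acts roughly like a horizontal low-pass filter, so column-alternating dither collapses toward a midtone.

```python
import numpy as np

row = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])  # alternating dither columns
kernel = np.ones(3) / 3.0                                   # crude low-pass stand-in
blended = np.convolve(row, kernel, mode="same")
print(blended)
# The full 0-to-1 stripes collapse to a small ripple around mid grey
# (values near 0.33/0.67), which on a real set reads as a blended tone.
```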
Genesis is an interesting example because it has the worst composite output but supports RGB with just a cable…
It’s like Sega is saying “Sure, you have your Sonic waterfalls and tubes and those are neat and all, but here’s RGB for a much better experience if you can afford it.”