Wii and composite cable. Not a real SNES, though Snes9xGx is configured to output 240p. I doubt there's a real difference here versus a real SNES.
I'd be interested in seeing the Sonic waterfalls if you're using an NTSC Wii. On my PAL Wii, it's definitely vastly cleaner than on a real Genesis, but the PAL console always outputs PAL 60, which I assume also makes quite a difference. The Wii also always scales the output digitally into the 640px range; it works a bit differently from real consoles in this regard.
The Wii probably outputs 320 (or 640?) pixels horizontally and stretches the SNES's 256, which is why it needed the Mega Drive setting. And that rainbow most probably wouldn't be there on a real SNES, as it outputs 256. Just a guess, and I could be wrong.
You're right! After reading this thread I learned that the Wii doubles the horizontal resolution of 256x224 SNES games before sending them to a CRT. So these rainbows were the result of sending a 512x224 or 640x224 image to my screen. It's a side effect of not using real hardware.
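To picture what that rescaling does to the pixel grid, here's a rough sketch (my own illustration, assuming plain nearest-neighbor scaling to a 640-sample line, which may not be exactly what the Wii's video interface does) of how a 256-pixel row gets divided up:

```python
# Hypothetical illustration: nearest-neighbor upscale of a 256-pixel line to
# 640 output samples. 640 / 256 = 2.5, so the source pixels can't all come
# out the same width -- they alternate between 2 and 3 samples wide.

def column_widths(src_width=256, dst_width=640):
    """Count how many output samples each source pixel ends up covering."""
    widths = [0] * src_width
    for x in range(dst_width):
        widths[x * src_width // dst_width] += 1  # nearest-neighbor source index
    return widths

widths = column_widths()
print(widths[:8])           # [3, 2, 3, 2, 3, 2, 3, 2]
print(sorted(set(widths)))  # [2, 3]
```

Those uneven 2/3-wide pixels add fine detail that a native 256-wide signal wouldn't have, which seems like a plausible contributor to the extra rainbowing over composite.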
Yeah, too many fine details overlapping with the chroma, causing rainbow effects.
Vague_Rant, who also made the Wii optimal video settings thread many years back, had a great article about Wii video on GitHub which I can't find anymore, but I saved it as a PDF. It isn't directly related to composite, though; it's about the ratios, scaling, and filtering. Downloadable here:
https://www.mediafire.com/file/opiglyi6gz1v0ee/Wii_Video_%C2%B7_GitHub.pdf/file
Another thing is that systems that use 1 and 3 phase tend to produce diagonal rainbows, unlike the vertical rainbows produced by the Genesis in 2-phase mode, which is something both Sonkun and I observed in our own research and implemented in our presets.
I remember posting some screenshots a while ago, maybe 1 or 2 years ago. Time flies so fast in these forums.
I found the post I was talking about.
Here’s Genesis for comparison:
And here's an example of what takes place on the NES.
Some info about systems that can do artifact colors:
Amiga: can do NTSC artifacts. The machine already had plenty of colors, so the technique was not exploited; plus, most users had RGB monitors. A possible exception is Virtual Karting 2.
Atari 8-bit: possible. 7.16 MHz pixel clock in high-res (2x NTSC, 2 pixels share 1 color).
C64: the dot clock is not a multiple of the color clock; it is around 8.18 MHz for NTSC. This means NTSC artifacts are visible with hires gfx, but they vary across the scanline and over time, so they're not usable. Other artifacts were commonly used instead: flickering and PAL delay-line mixing.
ZX Spectrum: the pixel clock is 3.5 MHz, not a multiple of the color clock. Only the ZX Spectrum 128 had the same clock for color and pixels.
ZX81: it generates a color burst, but the dot clock is tied to the 3.25 MHz CPU clock.
Amstrad CPC: the dot clock is a multiple of 1 MHz, therefore artifacts are not usable.
Atari 2600: each pixel is one 'clock' of the TIA's processing time, and there are exactly 228 color clocks of TIA time on each scanline. But a scanline consists not only of the time it takes to scan the electron beam across the picture tube, but also the time it takes for the beam to return to the start of the next line (the horizontal blank, or retrace). Of the 228 color clocks, 160 are used to draw the pixels on the screen (giving us our maximum horizontal resolution of 160 pixels per line). No artifacts.
CGA: possible. 14.31818 MHz pixel clock max (4x NTSC).
Apple II: possible. The pixel clock is exactly twice the color clock. It has an option to turn off the color burst to generate reasonably crisp black-and-white text, then turn it back on for the ability to generate artifact colors.
In general, it's possible if the pixel clock is a multiple of the color clock (2x pixel clock, 2 pixels share 1 color, etc.). A 1x pixel clock is not enough; each pixel just gets its own color. When the pixel clock is not an exact multiple, you get those drifting rainbow artifacts.
That's a pretty big mistake in the ZX Spectrum's design too: not using a 2x NTSC clock for the pixels to create extra colors, when the machine already has a big problem with color attribute clash. A 14.318 MHz master clock (the cheap crystal everyone was using), divided by 2 for the dot clock and by 4 for the CPU, was the way to go.
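To put the "multiple of the color clock" rule in concrete numbers, here's a quick sketch (my own illustration, using the clock figures quoted above) that compares each machine's pixel clock against the NTSC subcarrier:

```python
# Rough check of the rule above: stable artifact colors need the pixel clock
# to be an integer multiple (>= 2x) of the NTSC color subcarrier, so that a
# fixed pattern of 2+ pixels always lands on the same part of a chroma cycle.

NTSC_SUBCARRIER_HZ = 315e6 / 88  # ~3.579545 MHz colorburst

def artifact_check(name, pixel_clock_hz):
    ratio = pixel_clock_hz / NTSC_SUBCARRIER_HZ   # pixels per chroma cycle
    phase_per_pixel = 360.0 / ratio               # chroma phase swept by one pixel
    if abs(ratio - round(ratio)) < 1e-3:
        if round(ratio) >= 2:
            verdict = "integer multiple -> stable artifact colors possible"
        else:
            verdict = "1 pixel per chroma cycle -> too coarse, no artifacts"
    else:
        verdict = "not a multiple -> phase drifts along the line, unstable rainbows"
    print(f"{name:11s} {ratio:5.2f}x subcarrier, {phase_per_pixel:5.1f} deg/pixel: {verdict}")

artifact_check("Apple II",   2 * NTSC_SUBCARRIER_HZ)  # hi-res dot clock, exactly 2x
artifact_check("CGA",        4 * NTSC_SUBCARRIER_HZ)  # 14.31818 MHz, exactly 4x
artifact_check("Atari 2600", 1 * NTSC_SUBCARRIER_HZ)  # one color clock per pixel
artifact_check("C64 (NTSC)", 8.18e6)                  # dot clock isn't a multiple
```

Nothing here is from any machine's service manual; it's just the arithmetic behind why 2x and 4x dot clocks can be exploited while a 1x or non-multiple clock can't.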
This is kind of tangential, but I think this info could be helpful to document. I recently went through some pain to determine what level of display beyond RF video was supported by various consoles and to what degree this support was official. Some of it was from memory. For example, I distinctly remember seeing ads for official S-Video cables for the N64. But most of it required digging for sources amid a sea of popular mods to get RGB output from various consoles. I won’t be discussing every console, just the popular ones.
Before the NES, hardly anything supported composite (RCA) output. Everything connected to a television via RF. This required a modulator to convert the baseband video signal from the console to a radio signal. There were two termination methods: one was 300 ohm twin-lead and the other was 75 ohm coaxial (the same as is used for modern television antennas). Twin-lead is the worse option, being sensitive to interference. None of this was consistent: the resulting picture quality doesn't just vary among consoles, but also among regions, as different regions required different modulation techniques and usable channels.
Composite (RCA, A/V) video was adopted fairly quickly in Japan but took longer in the West. Ironically, the NES included direct RCA ports for A/V, whereas the Famicom was RF only. Third-party models and the later Famicom revision in 1993 provided A/V output.
Consoles moved to using 'multi-out' ports in proprietary configurations. This allowed different outputs through one jack. For example, the Sega Master System included an RF unit in most configurations, but an optional A/V cable could be purchased for composite video. Some consoles, like the PC Engine and Mega Drive (Genesis), provided pins to access RGB signals. However, the RGB outputs on these devices are very poor and do not seem to be meant for consumer use (perhaps rather for debugging and development). Using the RGB pins on the PC Engine results in incorrect colors, while the pins on the Genesis will show jail bars. Cleaning up this output requires mods. However, there is one instance where Sega did provide an official RGB solution for the Genesis, and that was specifically for French Mega Drives, where SCART was ubiquitous.
Sega Master System manual showing included and optional accessories
Official Video Monitor Cable for Mega Drive, included with Japanese Mega Drives, but an additional purchase for other markets (a stereo version was released for the Mega Drive 2)
Rare SCART cable for French Mega Drives (does not work on other Mega Drives without modification)
The final fourth-generation console, the Super Famicom (Super Nintendo), is the first to officially support S-Video. It also supports RGB properly. Here are the official cables that were sold:
P/N | Description |
---|---|
SNSP-003 | RF Switch |
SHVC-007 | Monaural A/V Cable |
SHVC-008 | Stereo A/V Cable |
SHVC-009 | S-Video (S-VHS) Cable |
SHVC-010 | RGB Cable (JP-21) |
SNSP-015 | Euro Connector Plug (Stereo A/V to SCART) |
It’s important to note that the RGB cable is not a SCART cable and will not work with SCART TVs. It is actually a JP-21 cable. The pins are different. The RGB cable is a rare cable that was only released in Japan. The ‘Euro Connector Plug’ is simply an RCA to SCART adapter and does not provide any better image than standard composite. For RGB output on a US or EU SNES, a third-party cable would have to be used. These cables were not really available or used when the SNES was popular (SCART was not even used in the US at all).
The Saturn and PlayStation officially supported RGB with JP-21 cables, again only available in Japan. The Nintendo 64 oddly did not have any support for RGB, only RF, composite, and S-Video. That said, even in Japan the JP-21 connectors were never very popular, and by the time the N64 came around it was essentially a failed technology.
Sony did release a SCART cable for the PlayStation (P/N SCPH-1052).
The Dreamcast was an odd duck: while it did support S-Video, it also supported VGA.
By this time, component video was also becoming more common. In the US, RCA connectors were used, which created a Hydra-looking thing that was a pain to plug in. Japan used a new connector called D-Terminal. The PlayStation 2 was the first console to officially support component. With the Xbox 360, consoles moved to HDMI, and that has remained the standard since (the Wii was limited to component video, however).
Below is a table showing the various output methods on popular consoles. Any corrections or additions to this table are welcome.
Console | RF | Composite | S-Video | RGB JP-21 | RGB SCART | VGA | Component |
---|---|---|---|---|---|---|---|
Famicom/FDS | Yes | No | No | No | No | No | No |
New Famicom (Not FDS compatible) | Yes | Yes | No | No | No | No | No |
NES | Yes | Yes | No | No | No | No | No |
Master System | Yes | Yes | No | 3rd | 3rd | No | No |
Master System II | Yes | No | No | Mod | Mod | No | No |
PC Engine (PI-TG001) | Yes | Add-on | No | Mod | Mod | No | No |
PC Engine | Yes | Yes | No | Mod | Mod | No | No |
Mega Drive (J) / Genesis | Yes | Yes | No | 3rd | 3rd | No | No |
Mega Drive (PAL) | Yes | Yes | No | 3rd | Yes (France) | No | No |
Super Famicom | Yes | Yes | Yes | Yes | 3rd | No | No |
SNES | Yes | Yes | Yes | 3rd | 3rd | No | No |
Super Famicom Jr. / SNES Mini | Yes | Yes | Mod | Mod | Mod | No | No |
Saturn | Yes | Yes | Yes | Yes | 3rd | No | No |
PlayStation | Yes | Yes | Yes | Yes | Yes | No | No |
Nintendo 64 | Yes | Yes | Yes | Mod | Mod | No | No |
Dreamcast | Yes | Yes | Yes | No | 3rd | Yes | No |
PlayStation 2 | Yes | Yes | Yes | Yes | Yes | No | Yes |
GameCube (NTSC) | Yes | Yes | Yes | No | No | No | Yes |
GameCube (PAL) | Yes | Yes | No | No | Yes | No | Yes |
With mods, I would think you can add S-Video and RGB to anything mainstream these days.
The GameCube has an official RGB cable available (DOL-013), probably to compensate for the loss of S-Video on PAL Cubes.
PS2 had the VGA adapter from the Linux Kit, although I’m not sure if that really counts as “VGA compatible” because of the sync on green signal.
Early PC Engine (PI-TG001) is RF only.
Later models including PC Engine CoreGrafx (PI-TG3) support composite.
Even PI-TG001 can output composite via the expansion bus when combined with a CD-ROM2 unit.
Also, being a Japanese site, it would be quite a challenge to browse while translating, but I believe the site is valuable for the information it has on console output.
Thanks for that link. I can read Japanese. The Wonder Mega supports S-Video, which is interesting.
Currently, Retroarch has a wide variety of NTSC shaders and presets, including RF/Composite/S-Video, which can be combined in various ways to achieve the image quality I want.
However, there is one problem.
That is Component. In Japan it is called D-Video, which was used on the PS1/PS2/PSP/Wii. Early GC models also had it, but I'm not sure whether the internal signal is D-Video or not.
Now, how would Component turn out in terms of NTSC shader parameters? I would love to see Component added to the NTSC presets.
Or is Component something that is just not in demand?
It seems to look different from RGB.
I figured as much for Component or D-Video, because the three color signals are physically independent.
I thought that simply setting “NTSC Blend Mode” to 0 in the NTSC shader would make it more like Component or D-Video.
Component definitely looks different than RGB, as it’s a different color space. (Besides color it’s largely identical tho)
Does it have any relation to NTSC, though? Quality-wise, the differences between it and standard RGB are usually considered marginal at best. I've run component through an HDMI-VGA chain, and that didn't seem to degrade the signal very noticeably.
Honestly, I think it's completely unrelated to NTSC, but 🤷.
And yeah, agreed, quality-wise RGB = Component (if that's what you're saying, idk, it's been a long day, don't @ me).
I see now that I am on the wrong topic.
Component/YPbPr is a relative of RGB. It has one line for luma/Y, and then a line each for the two chroma components, which are calculated as blue minus luma (this is the ‘Pb’) and red minus luma (this is the ‘Pr’).
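For anyone who wants the actual math behind "blue minus luma" and "red minus luma", here's a minimal sketch using the standard BT.601 (SD) coefficients; the exact constants differ for HD/BT.709, so treat it as illustrative rather than a full encoder:

```python
# Sketch: RGB -> YPbPr with BT.601 luma weights (SD video). Inputs normalized
# to 0..1; Pb and Pr come out in roughly -0.5..+0.5. Real encoders also deal
# with black levels, sync, gamma, etc., which are ignored here.

def rgb_to_ypbpr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted sum of R, G, B
    pb = (b - y) / 1.772                    # "blue minus luma", scaled to +/-0.5
    pr = (r - y) / 1.402                    # "red minus luma",  scaled to +/-0.5
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # pure red: low Y, negative Pb, Pr = +0.5
print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white:    Y = 1.0, Pb = Pr = 0 (no chroma)
```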
It never undergoes NTSC/PAL de/modulation, so it doesn’t get artifacts or color changes from that, and there’s no crosstalk among the signals due to dedicated cables.
Also, RGB usually has H/V or C sync signals on dedicated lines, while YPbPr carries sync on the luma (green) cable (or on all 3 cables in HD signals, apparently).
A fun bit of backwards compatibility built into YPbPr is that you can plug the luma cable into any TV that supports composite video (as long as the resolution is still compatible) and get a super-sharp black-and-white image, even on crappy TVs.
Thanks for the detailed explanation. However, the situation is a little different for what is called D-Video or D-Terminal. This one has inferior image quality compared to Component.
Japanese Wikipedia says: "Some say that the picture quality is slightly inferior to the Component terminal due to the structure of the connector (e.g., lack of alignment at the connection, inability to maintain shielding of signal lines)." My opinion is that it is blurrier than RGB but appears clearer than S-Video.
In Japan, there is a majority opinion that Component is clearer than D-Video or D-Terminal.
However, since D-Video or D-Terminal itself is a Japan-specific standard, I'll give up that pursuit.
It would be nice if we could apply NTSC-shader-style blending while keeping RGB/YPbPr colors.