NES NTSC Color Decoding Shader

I was able to port Bisqwit’s NES palette generator to a shader, using the ‘raw’ palette in the Nestopia and FCEUmm cores, which encodes the NES PPU’s raw chroma, level, and emphasis bits into an RGB pixel. If you use any palette besides ‘raw’, the shader will not work.
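For reference, this is roughly how the decoding pass can recover those bits from the incoming pixel. The channel layout here (chroma in red, level in green, emphasis in blue, each stored as value/15) is my reading of the ‘raw’ palette, so treat it as a sketch rather than the shader’s exact code:

```glsl
// Sketch: unpack the PPU bits that the 'raw' palette stores in each pixel.
// Assumes chroma in R, level in G, emphasis in B, each scaled by 1/15.
void unpackRaw(vec3 c, out int chroma, out int level, out int emphasis)
{
    chroma   = int(floor(c.r * 15.0 + 0.5)); // 4-bit chroma/hue (0-15)
    level    = int(floor(c.g * 15.0 + 0.5)); // 2-bit luma level (0-3)
    emphasis = int(floor(c.b * 15.0 + 0.5)); // 3 emphasis bits (0-7)
}
```

From there the shader can synthesize the NTSC signal for that pixel and decode it to YIQ, the same way Bisqwit’s generator does on the CPU.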

I ported the shader to each of the shader formats supported by RetroArch:

Cg: https://github.com/libretro/common-shaders/blob/master/misc/nes-color-decoder.cg
GLSL: https://github.com/hizzlekizzle/glsl-shaders/blob/master/misc/nes-color-decoder.glsl
Slang: https://github.com/libretro/slang-shaders/blob/master/misc/nes-color-decoder.slang

There were originally three versions of the shader in each repo, each with a different YIQ-to-RGB conversion matrix that can produce different color output. The different matrices are now included in a single shader, with a runtime parameter to toggle the Sony matrix.

See http://hitmen.c02.at/temp/palstuff/ under YIQ (NTSC) for the conversion matrices used. The non-Sony ones seem to generate almost the same colors in most instances, but I included both anyway.
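To show what the matrix actually does, here is the conversion written out with the FCC coefficients and the Sony CXA2025AS (US) coefficients as they are commonly quoted; check the shader source or the hitmen page above for the exact values in use:

```glsl
// YIQ-to-RGB with the FCC coefficients (as commonly quoted).
vec3 yiq2rgb_fcc(vec3 yiq)
{
    return vec3(
        dot(yiq, vec3(1.0,  0.946882,  0.623557)),  // R
        dot(yiq, vec3(1.0, -0.274788, -0.635691)),  // G
        dot(yiq, vec3(1.0, -1.108545,  1.709007))); // B
}

// Same structure with the Sony CXA2025AS (US) coefficients.
vec3 yiq2rgb_sony(vec3 yiq)
{
    return vec3(
        dot(yiq, vec3(1.0,  1.630,  0.317)),  // R
        dot(yiq, vec3(1.0, -0.378, -0.466)),  // G
        dot(yiq, vec3(1.0, -1.089,  1.677))); // B
}
```

The first column is always 1.0 because Y passes straight through; only the I/Q weights differ between decoders, which is why the resulting palettes differ mainly in hue and saturation rather than brightness.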

I included presets for this shader in the cgp directory (or presets directory in slang), of which nes-color-decoder+pixellate should provide the simplest image.

The shader has parameters to adjust saturation, hue, brightness, contrast, and gamma, so you will be able to adjust the color output to match how your TV is configured. Keep in mind that some colors that can be shown on real hardware will get clipped in sRGB at higher saturation, particularly the blues. There is a parameter to change the method by which out-of-gamut colors are clipped.
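If you’re wondering how those adjustments typically work: hue and saturation are applied in YIQ space before the matrix, with hue rotating the I/Q plane and saturation scaling it. A minimal sketch (the parameter names are mine, not necessarily the shader’s):

```glsl
// Sketch: hue/saturation on the chroma plane, brightness/contrast on luma.
vec3 adjustYIQ(vec3 yiq, float hue, float saturation,
               float brightness, float contrast)
{
    float s = sin(hue);
    float c = cos(hue);
    // Rotate I/Q by the hue angle, then scale by saturation.
    vec2 iq = vec2(yiq.y * c - yiq.z * s,
                   yiq.y * s + yiq.z * c) * saturation;
    // Scale and offset luma for contrast and brightness.
    return vec3(yiq.x * contrast + brightness, iq);
}
```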

Screenshots:

Input with ‘raw’ palette:

nes-color-decoder output (FCC matrix):

nes-color-decoder output (Sony CXA2025AS US matrix):

nes-color-decoder output (Popular non-FCC matrix):


Great work, man! I really dig the colors on the non-Sony variants.


Nice! Thanks for sharing :)

Interesting. Is there an advantage to this approach over setting a palette in Nestopia? Or is it primarily for ease of creating custom palettes?

It results in a calculated palette rather than an eyeballed or captured one (i.e., objective rather than subjective). Bisqwit is a bit of a wizard at this sort of thing, too, so his calculations are typically spot-on.

The Sony decoder might be somewhat hue-shifted, because I had to set the hue to -15° in the shader to get it to look like Nestopia’s Consumer decoding preset. I’m not sure why that is.

The shader now supports switching to the Sony matrix with a runtime parameter. I also added different clipping methods, with a runtime parameter to switch between them.

I also ported a separate colorimetry shader, taken from Drag’s NES Palette Generator, that can be added as a second pass. It lets you simulate a different screen colorimetry, such as the 1953 FCC standard, switch between D93 and D65 white points, and it has a clipping method parameter as well.

Cg: https://github.com/libretro/common-shaders/blob/master/misc/colorimetry.cg
GLSL: https://github.com/libretro/glsl-shaders/blob/master/misc/colorimetry.glsl
Slang: https://github.com/libretro/slang-shaders/blob/master/misc/colorimetry.slang

In both shaders, the clipping method parameter works as follows:

0 = Clamp
1 = Darken
2 = Desaturate
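For the curious, here is one plausible implementation of those three methods; the shader’s actual math may differ:

```glsl
// Sketch of three ways to handle an out-of-gamut (component > 1.0) color.
vec3 clipColor(vec3 rgb, int method)
{
    float peak = max(rgb.r, max(rgb.g, rgb.b));
    if (peak > 1.0) {
        if (method == 1) {
            // Darken: scale the whole color down, preserving hue.
            rgb /= peak;
        } else if (method == 2) {
            // Desaturate: mix toward luma until the peak channel hits 1.0.
            float luma = dot(rgb, vec3(0.299, 0.587, 0.114));
            if (luma < 1.0)
                rgb = mix(rgb, vec3(luma), (peak - 1.0) / (peak - luma));
        }
    }
    // Clamp (method 0), plus a final safety clamp for the other modes.
    return clamp(rgb, 0.0, 1.0);
}
```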

In the colorimetry shader, the colorimetry mode parameter works as follows:

0 = FCC (1953)
1 = SMPTE C (1987)
2 = sRGB
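The heart of a pass like this is a 3×3 primaries conversion on linear RGB. As a sketch of the FCC (1953) mode: treat the decoded color as 1953 NTSC RGB, convert to XYZ with the standard published matrix, then convert to sRGB for display. (Proper D93/D65 white point handling would add a chromatic adaptation step, which I’m omitting here.)

```glsl
// Sketch: show a color mastered for FCC 1953 NTSC primaries on an sRGB
// screen. Both matrices are the standard published ones; inputs are
// assumed to already be linear (no gamma handling shown).
vec3 ntsc1953_to_srgb(vec3 rgb)
{
    // 1953 NTSC RGB -> XYZ (illuminant C white point).
    vec3 xyz = vec3(
        dot(rgb, vec3(0.6069, 0.1735, 0.2003)),
        dot(rgb, vec3(0.2989, 0.5866, 0.1145)),
        dot(rgb, vec3(0.0000, 0.0661, 1.1162)));

    // XYZ -> linear sRGB (D65 white point).
    return vec3(
        dot(xyz, vec3( 3.2406, -1.5372, -0.4986)),
        dot(xyz, vec3(-0.9689,  1.8758,  0.0415)),
        dot(xyz, vec3( 0.0557, -0.2040,  1.0570)));
}
```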

Default nes-color-decoder settings with FCC (1953) colorimetry and D93 white point:
