The raw palette is in the NES’s internal hue-level-emphasis format (in that order), with values normalized to the 0–255 range. The NES PPU writes composite video directly by switching between just a few fixed voltage levels, without any concept of Y or C, so in order to convert that into YC space, you have to simulate the NES PPU’s composite video and decode it.
This article goes over the whole process with some sample code. https://www.nesdev.org/wiki/NTSC_video
For me, the easiest way to understand has been to read source code. These two programs generate only the colors, as opposed to a whole NTSC-filtered image. https://github.com/ChthonVII/gamutthingy https://github.com/Gumball2415/pally
Maybe a text summary would be better, though.
The level is an integer from 0 to 3. This selects a pair of voltages. The NES directly outputs a square wave that alternates between those two voltages. That affects both Y and the saturation of C.
The hue is an integer from 0 to 15. Hues 0 and 13 are grays: hue 0 holds the high voltage constantly, and hue 13 holds the low voltage. Hues 1 through 12 output the square wave with one of 12 different phases, giving 12 different hue angles to pick from. If you pick hue 14 or 15, the result is always the same as picking level 1 and hue 13 together, even if level has been set to something other than 1.
Emphasis is 3 bits: one for red, one for green, and one for blue. Each emphasis bit attenuates the signal during half of the chroma cycle, aligned with a particular hue’s phase, pulling it down to a lower preset voltage. In effect, this causes some reduction of red, green, or blue. Emphasis is skipped if you set hue to 14 or 15.
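To make the level/hue/emphasis rules concrete, here is a Python sketch adapted from the sample code on the nesdev wiki page linked above. The voltage tables are the 2C02 values from that page; the function names, and the carrier reference phase in the little YIQ decoder at the end, are my own choices, so treat those as assumptions.

```python
import math

# Signal voltages relative to sync, indexed by "level" (0-3), per the
# nesdev wiki's 2C02 table.
LOW  = [0.350, 0.518, 0.962, 1.550]   # voltage while the wave is low
HIGH = [1.094, 1.506, 1.962, 1.962]   # voltage while the wave is high
BLACK, WHITE = 0.518, 1.962           # normalization endpoints
ATTENUATION = 0.746                   # multiplier applied by emphasis

def in_color_phase(hue, phase):
    """True during the half of the chroma cycle where this hue's wave is high."""
    return (hue + phase) % 12 < 6

def ntsc_signal(pixel, phase):
    """One of the 12 voltage samples (normalized 0..1) for a palette entry.

    pixel is the 9-bit eee-ll-cccc value: 3 emphasis bits, 2 level bits,
    4 hue bits. phase is 0..11, the sample index within the chroma cycle.
    """
    hue      = pixel & 0x0F
    level    = (pixel >> 4) & 3
    emphasis = pixel >> 6
    if hue > 13:
        level = 1               # hues 14-15 force level 1 (same as $1D)
    low, high = LOW[level], HIGH[level]
    if hue == 0:
        low = high              # hue 0: constant high voltage (gray)
    if hue > 12:
        high = low              # hues 13-15: constant low voltage
    signal = high if in_color_phase(hue, phase) else low
    # Each emphasis bit attenuates half of the chroma cycle; skipped
    # entirely for hues 14-15.
    if hue < 14 and ((emphasis & 1 and in_color_phase(0, phase)) or
                     (emphasis & 2 and in_color_phase(4, phase)) or
                     (emphasis & 4 and in_color_phase(8, phase))):
        signal *= ATTENUATION
    return (signal - BLACK) / (WHITE - BLACK)

def decode_yiq(pixel):
    """Average the 12 samples for Y; correlate with the carrier for I/Q."""
    y = i = q = 0.0
    for phase in range(12):
        s = ntsc_signal(pixel, phase) / 12.0
        y += s
        angle = math.pi * phase / 6.0   # carrier angle (reference phase assumed)
        i += s * math.cos(angle)
        q += s * math.sin(angle)
    return y, i, q
```

A quick sanity check: grays decode to near-zero I and Q, and hue 14 at any level produces the same samples as $1D, matching the rule above.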
The NES’s output isn’t shaped perfectly, however. For each increase in “level”, you get some shift in hue. Hues 2, 6, and 10 (if memory serves) have a hue shift too, as well as a slight increase in Y. The amount of hue or Y shift varies significantly between different revisions of the NES PPU.
Here’s something that those programs and that article don’t address, though I haven’t ruled out the possibility that my NES just isn’t working properly. On my own NES, the choice between RF and composite also has a significant effect. The NES is much sharper over composite than over RF: composite has a grainy, sandpapery look, while RF has a flatter, blurrier look. Feeding the NES’s composite output into a VCR to convert it to RF lets you get the sharp, grainy look on an RF-only TV set. I need to check this again, but I believe I’ve noticed my NES having more saturated colors when connected over RF instead of composite. Don’t forget that the original Famicom in Japan had only RF (though it could be modded for composite), while the NES in the rest of the world had composite, so simulating both is necessary to reflect different developers’ intents.
Edit: I forgot to mention that in the past I have also video-captured my NES’s palette, but it never turned out great. Decoding the NES’s colors into RGB produces some values below 0 or above 255; once a capture clamps those to the 0–255 range, the information is gone and the values can no longer be converted back to YUV/YIQ. That is why using FirebrandX’s Composite Direct palette (or any normal NES palette at all) with a composite video emulation shader does not work. To get around this, I did my own NES video capture with the capture’s black level raised and white level lowered, so that every value stayed between 0 and 255 without clamping, and in my shaders I would exactly undo that change. I have tried this several times, on both Ubuntu and Windows, and the capture had problems every single time, so I won’t link it anywhere. For now, then, we’re stuck with emulated palettes, not video capture.
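As a concrete sketch of that reversible level trick: compress the levels so out-of-range decoded values survive a 0–255 capture, then exactly expand them back in the shader. The pedestal/headroom numbers below are made up for illustration, not measured from any real capture, and how far outside 0–255 real NES colors land depends on the decoding.

```python
# Hypothetical reversible range compression for a video capture.
# PEDESTAL/HEADROOM are illustrative, not real calibration values.
PEDESTAL = 32.0    # raised black level in the capture (assumed)
HEADROOM = 223.0   # lowered white level in the capture (assumed)

def compress(v):
    """Map a decoded value (possibly outside 0..255) into the capturable range."""
    return PEDESTAL + v * (HEADROOM - PEDESTAL) / 255.0

def expand(v):
    """Exactly undo compress() in the shader, restoring out-of-range values."""
    return (v - PEDESTAL) * 255.0 / (HEADROOM - PEDESTAL)
```

With these example numbers, a decoded value like 280 or −40 stays inside 0–255 in the capture instead of being clamped, and the round trip recovers it.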