My shader is just doing composite for now. I removed the RF lowpass because it wasn’t working well. I am experimenting with fast noise now, though. (Edit: Come to think of it, S-Video would probably take me about a minute to add.)
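In case “fast noise” is unclear: the usual building block is a cheap per-pixel integer hash. Here is a generic C sketch of that kind of hash (the constants and exact mixing are just one common pattern, not necessarily what I’ll end up using in the shader):

```c
#include <stdint.h>

/* Cheap per-sample noise: hash the pixel coordinates and frame counter,
   then map the result to [-1, 1). Fast enough to run per pixel. */
float fast_noise(uint32_t x, uint32_t y, uint32_t frame)
{
    uint32_t h = x * 374761393u + y * 668265263u + frame * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u; /* scramble so nearby inputs diverge */
    h ^= h >> 16;
    /* take the top 24 bits and scale from [0, 2^24) to [-1, 1) */
    return (float)(h >> 8) * (2.0f / 16777216.0f) - 1.0f;
}
```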
There is still a lot of reading I need to do about adaptive comb filtering. What I have implemented is only an educated guess, arrived at by trying several random ideas and seeing what sticks.
The code currently does a mix of notch and comb filtering. The idea is that the lowest frequencies of the signal contain very little chroma, so we can just notch filter that region and get near-perfect luma there. The rest of the signal is then adaptively comb filtered, and it always attempts to comb filter rather than falling back to a plain notch filter.
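To make that concrete, here is a rough C sketch of the idea. This is my own illustrative reconstruction, not the shader’s actual code: the box lowpass stands in for a real notch filter, and it assumes NTSC’s 180° chroma phase flip between adjacent scanlines so that averaging two similar lines cancels chroma.

```c
#include <stddef.h>
#include <math.h>

#define LP_RADIUS 8 /* box radius; stands in for the real notch cutoff */

/* Crude box lowpass over a composite line; this is the "near-perfect
   luma" region, since the lowest frequencies carry almost no chroma. */
static float box_lowpass(const float *line, size_t n, size_t i)
{
    float sum = 0.0f;
    int count = 0;
    for (int k = -LP_RADIUS; k <= LP_RADIUS; k++) {
        long j = (long)i + k;
        if (j >= 0 && j < (long)n) {
            sum += line[j];
            count++;
        }
    }
    return sum / (float)count;
}

/* Split one sample of the current line into luma and chroma, comb
   filtering the high band against whichever neighbor line looks more
   similar (the "adaptive" part). */
void separate_sample(const float *prev, const float *cur, const float *next,
                     size_t n, size_t i, float *luma, float *chroma)
{
    float low      = box_lowpass(cur, n, i);  /* notch-region luma    */
    float high     = cur[i] - low;            /* band carrying chroma */
    float low_prev = box_lowpass(prev, n, i);
    float low_next = box_lowpass(next, n, i);

    /* Pick the neighbor whose low-frequency content matches ours best;
       combing against a dissimilar line leaks luma detail into chroma
       (hanging dots). */
    float err_prev = fabsf(low - low_prev);
    float err_next = fabsf(low - low_next);
    float other = (err_prev < err_next) ? (prev[i] - low_prev)
                                        : (next[i] - low_next);

    /* Comb: the sum cancels the phase-inverted chroma, the difference
       isolates it. */
    *luma   = low + 0.5f * (high + other);
    *chroma = 0.5f * (high - other);
}
```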
For my settings, “Sharpness” affects the strength of that luma notch filter, and “Comb filter sharpness” directly multiplies the comb-filtered region by a factor. There are some other modes in there that I experimented with, if you want to try them out, but I like the defaults I’ve set. Whatever you do, though, don’t demodulate at the narrowband bandwidth, as the baseband bandwidth is needed for the adaptive filter to compare lines effectively. (Edit: One of the modes lets you compare the comb-filtered region of luma instead of the demodulated/lowpassed chroma, which allows you to use the narrowband bandwidth while still comb filtering somewhat decently, but I didn’t like how it looked.)
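If it helps, here is a guess at how those two knobs could map onto the band split from the sketch above. The names and exact formulas are mine, not the shader’s; it just shows one knob controlling notch strength and the other directly scaling the comb-filtered band.

```c
/* Hypothetical mapping of the two sliders onto the split bands:
   low       - notch-region (lowpassed) luma
   raw_high  - raw composite residual above the notch cutoff
   comb_high - comb-recovered luma in that same band */
float shaped_luma(float low, float raw_high, float comb_high,
                  float sharpness, float comb_sharpness)
{
    /* "Sharpness" as notch strength: at 1.0 the notch fully removes
       the raw high band; at 0.0 it passes through untouched. */
    float kept      = (1.0f - sharpness) * raw_high;
    /* "Comb filter sharpness" directly multiplies the comb-filtered
       region by a factor. */
    float recovered = comb_sharpness * comb_high;
    return low + kept + recovered;
}
```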
In my experience, the way to figure things out is from chip datasheets, CRT service manuals, schematic diagrams, and, most importantly, getting real CRTs and consoles to try for myself. I have a Panasonic CT-36D30B from 2000 with an adaptive comb filter, plus a Genesis, NES, PS1, Wii, and a cheap composite-to-HDMI converter for testing the comb filter. (Edit: Occasionally, patents can be helpful. There are also some research papers that include measurements of CRTs.)