NesDev is the primary source for NES measurements, yes.
It seems “safe voltages” is the right fix for this.
With safe voltages set to 2.0 it’s not fully fixed, but it’s way better.
Thanks @guest.r, all of these developments seem to further highlight the importance of “console specific” presets or parameter setups for those seeking higher levels of discernment.
I’ve always felt that NES and PS1 needed less added saturation than SNES, and that if you used a preset optimized for SNES, it would make NES colours oversaturated.
I hope you’re having fun with the new dials. Are we going to see safe voltage recommendations for other systems published as well, for example PC-Engine/TurboGrafx-16/Turbo Duo?
I wouldn’t want to use the wrong values arbitrarily.
With the previous implementation I was expecting an On/Off switch, so when I saw a safe voltage desaturation knob I wasn’t sure what to do. I first turned it up to 1.0 and things looked unpleasantly desaturated, and it would have required an immense amount of NTSC Saturation…err…“colour boosting” to make it look right again.
I think I ended up settling on 0.50.
Random thought. An AGC circuit/switch/knob might be cool. Is this the same concept that we’re seeing implemented in the latest release?
Could you have gotten the same result by eyeballing the loss of detail and simply lowering the Saturation?
In fact, the old fix was the opposite, which…
BTW, for this I think it would be better if libretro output more metadata, especially since there are cases like this.
AGC is tricky here, since “RF signal strength deviations” are pretty predictable with home devices (consoles, home computers). Some readings state that it’s very usable for broadcast input sources.
I read about “circuits” which also affect luma strength if the input signal is too boosted, for example in better Sony monitors. I found some old docs stating that safe voltage was regulated in Philips TVs and in some ITU stuff.
The issue is also that today’s “standard CRT TV circuit solutions” didn’t apply to vintage engineers; they had the freedom to solve problems by different means, and different companies did things differently. Some stuff was also patented (like the delay line from SECAM, Trinitron…), so there was motivation to innovate.
I would definitely be interested in taking a look at any links along these lines that people find, though I do wonder if these deviations are due to decisions made in hardware design, or just aging hardware?
It’s hard not to notice the exact inverse correlation with how old the console is, with even the PS2 apparently getting closer to “correct” the newer the specific console being tested is, which is part of why I was looking for the source posts last night.
My suspicion is that the PS1 and PS2 in particular may have originally been more or less bang on, given that, at the time, the PS1 made waves due to actually being designed to adhere to display standards.
(For example, before the PS1, it is my understanding that the US versions of literally every single Japanese-designed console still used the NTSC-J 0 IRE black level, resulting in crushed blacks on essentially all non-NTSC-J NTSC displays, which were nominally factory calibrated for a black level of 7.5 IRE. Whereas the PS1 and PS2 actually had regional IRE.)
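Just to illustrate the setup mismatch with a back-of-envelope mapping (this is my own rough model, not taken from any documented TV circuit; the function and numbers are illustrative only):

```glsl
// Hypothetical illustration of the 0 vs. 7.5 IRE mismatch: a display expecting
// 7.5 IRE setup maps the signal roughly as luma = (ire - 7.5) / (100 - 7.5).
// A console outputting NTSC-J levels puts black at 0 IRE, so any shadow detail
// the game places between 0 and 7.5 IRE lands below reference black and clips.
float display_luma(float ire)
{
    return clamp((ire - 7.5) / 92.5, 0.0, 1.0);
}
// display_luma(0.0) == 0.0  -> black
// display_luma(5.0) == 0.0  -> near-black shadow detail crushed to black
// display_luma(7.5) == 0.0  -> reference black
```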
I’m a little far down the rabbit hole, but I think AGC is the missing step here as well. It’s mentioned here; IDK if any of it helps. AGC is a multi-step thing, and color got added back through various processes: “Chroma AGC,” “DC restoration,” “adaptive peak white mode.”
https://www.analog.com/media/en/technical-documentation/data-sheets/max9526.pdf
https://www.analog.com/media/en/technical-documentation/data-sheets/adv7180.pdf
!Caution! ChatGPT:
1. AGC is a “soft correction,” not perfect restoration
- Chroma AGC operates on averages. It measures the amplitude of the chroma signal (often via the color burst reference) and adjusts gain so that the mean chroma amplitude stays near nominal.
- It doesn’t correct each individual pixel or hue independently. Any local clipping, compression, or interference is not fully undone.
- Effectively, it prevents the overall image from looking washed-out or oversaturated, but it can’t recover color lost due to voltage limiting or bandwidth constraints.
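If that description is roughly right (treat it with the same caution as the rest of the ChatGPT output), the “soft correction” it describes would look something like this, with hypothetical names:

```glsl
// One gain factor derived from an averaged measurement, applied uniformly.
// Samples that were clipped or band-limited upstream get rescaled too, but
// their lost detail is not recovered.
vec2 apply_chroma_agc(vec2 iq, float avg_burst_amplitude, float nominal_burst)
{
    float gain = nominal_burst / max(avg_burst_amplitude, 1e-6);
    return iq * gain;
}
```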
It’s a part of RF / antenna circuits, mostly helping when signal levels are weak or maybe in some cases too strong. From what I could read, composite etc. inputs bypass AGC completely.
TODO with NTSC/PAL shaders:
Modulate into a single “vec3 dimension”, like vec3(signal, 0.0, 0.0). If anyone tries this concept, it will bring all the “analogue quirks” with it and would require all the “TV” solutions (a rough sketch of the modulation step is below).
Dunno, i might give it a try…
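For what it’s worth, the modulation step of that idea might look something like this (a sketch only; the phase handling is simplified and the per-line/per-frame offsets are assumptions, not any existing shader’s code):

```glsl
// Collapse Y/I/Q into one composite scalar per sample and carry it in the
// first channel, e.g. vec3(signal, 0.0, 0.0).
const float PI = 3.14159265359;

float modulate_composite(vec3 yiq, float sample_x, float line_y, float frame)
{
    // NTSC subcarrier phase: ~90 degrees per sample at 4x colorburst sampling,
    // plus approximate per-line and per-frame phase offsets.
    float phase = (PI * 0.5) * sample_x + PI * line_y + PI * frame;
    return yiq.x + yiq.y * sin(phase) + yiq.z * cos(phase);
}
// Downstream passes would then need the full "TV" chain (filtering, burst
// locking, demodulation) to get the color back - which is exactly the point.
```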
Yes, I think you’re right. I’ve encountered at least five different names for this function so far — a little confusing, but they all seem to be describing the same thing. EDIT: I’m not sure, lol.
Is this a hallucination? Maybe we can get @anikom15 to weigh in
Burst AGC
- Burst AGC is distinct from the main AGC (Automatic Gain Control), even though they sound similar and both are “gain regulators.”
- Where it acts: later in the chain — only on the chroma subcarrier path.
- What it controls: chroma amplifier gain.
- Purpose: keeps color saturation constant regardless of transmission amplitude.
- Reference it measures: amplitude of the color burst on the back porch.
Overview of Composite Decoder Signal Path (with Burst AGC)
1. RF/IF Demodulator
   - Converts the tuned broadcast signal (or input RF) down to baseband composite video.
   - Output: composite waveform ~1 V p-p (sync tip at 0 V, white peak near +0.7 V).
2. Video Amplifier / Clamp / DC Restore
   - Stabilizes black level and overall voltage swing.
   - Ensures the composite stays within “safe” 1 V p-p video levels.
   - Sets the DC reference for later stages.
3. Main AGC (Automatic Gain Control)
   - Monitors sync tip or average luminance level.
   - Adjusts IF/video amplifier gain to keep total composite amplitude constant, regardless of signal strength.
   - Affects the entire video waveform (Y + C + sync).
   - This is where the “NTSC safe voltage” regulation happens.
4. Y/C Separation (Luminance / Chroma Filter)
   - Splits the composite into:
     - Y (luma) → low-pass filtered (≈ 4.2 MHz)
     - C (chroma) → band-pass filtered (≈ 3.58 MHz ± 0.6 MHz)
   - Chroma amplitude is now weaker due to filter losses.
5. Chroma Pre-Amplifier (Fixed Gain)
   - Applies a fixed boost (≈ +10–20 %) to restore chroma amplitude lost in filtering.
   - Simple broadband amplifier centered on 3.58 MHz.
   - Not adaptive.
6. Burst AGC (Chroma Gain Control Loop) — see the sketch after this list.
   - Samples the color burst on each line’s back porch.
   - Compares burst amplitude to a reference (≈ 40 IRE p-p).
   - Adjusts chroma amplifier gain to normalize color saturation.
   - Output: chroma signal with stabilized amplitude for demodulation.
7. Demodulator (I/Q or R–Y / B–Y)
   - Uses a PLL locked to the burst phase.
   - Converts the stabilized subcarrier into baseband color difference signals.
   - Output: I/Q (NTSC) or U/V (PAL).
8. Chroma Amplifier / User “Color” Control
   - Final gain stage applied to the baseband color difference signals.
   - Controlled by the front-panel “Color” knob (≈ 0.5×–2×).
   - May include minor shaping (e.g., I vs. Q bandwidth emphasis).
9. Matrixing and RGB Drive
   - Combines Y + scaled chroma (I/Q or R–Y/B–Y) into R, G, B.
   - Final limiter or clamp ensures safe drive to CRT cathodes.
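In shader terms, stage 6 boils down to something like the sketch below. This is my own rough translation of the list, not code from any real decoder or from guest.r’s shader, and the clamp range is arbitrary:

```glsl
// Per line: compare the measured burst amplitude to the ~40 IRE reference and
// apply the resulting gain to that line's chroma (I/Q pair).
vec2 burst_agc(vec2 iq, float measured_burst_ire)
{
    const float REFERENCE_BURST_IRE = 40.0;            // nominal burst, per the list
    float gain = REFERENCE_BURST_IRE / max(measured_burst_ire, 1.0);
    gain = clamp(gain, 0.5, 2.0);                      // real loops are sluggish/limited
    return iq * gain;
}
```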
@guest.r thank you my man. Will try, will try
What you, the rest of the devs and even us the users ourselves are achieving is monumental. This community has created the digital CRT.
@guest.r I think for practical purposes the chroma pre-amp can be “rolled into” NTSC Saturation; mathematically it’s the same? I.e., no need for a separate parameter for stage 5. Stage 6 is what I’m curious about: some kind of adaptive chroma adjustment?
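To spell out the “roll it in” point (parameter names here are mine, not guest.r’s): two fixed linear gains on chroma collapse into one multiplier, so a fixed pre-amp boost is indistinguishable from a slightly higher saturation value.

```glsl
// e.g. a +15% pre-amp and a saturation of 1.10 compose into a single 1.265x.
vec2 fixed_chain(vec2 iq, float preamp_gain, float ntsc_sat_like)
{
    return iq * (preamp_gain * ntsc_sat_like);
}
// The burst-AGC stage can't be folded in the same way: its gain depends on the
// measured burst each line, so it has to stay a separate (adaptive) step.
```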
See, that’s the trick of it: it’s all a “hallucination”; some of it just may happen to line up with reality. But it can never be relied upon under any circumstance. Every detail, no matter how large or small, is suspect.
Yes, it should all be verified. The Gain Control stuff is immensely confusing, as there seem to be multiple settings with similar names performing the same function, but there also seem to be distinct functions with similar-sounding names as well.
I think there’s maybe some conflation here. The AGC itself could use either the sync pulse or the colorburst for the overall signal amplification as both had stable levels. TVs could theoretically adjust saturation relative to the colorburst. I’m not sure how much they did that in practice. It’s an open question. The NES has a colorburst that is higher than the standard. That would result in a 25% reduction of chroma, a significant difference, if the TV adjusts chroma amplitude according to colorburst level.
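As a rough worked version of that 25% figure (the ~4/3 burst level is what the 25% implies under this model, not a measurement I’m asserting): if the decoder scales chroma by nominal/measured burst and the NES burst is about 1.33× nominal, the chroma gain works out to 0.75.

```glsl
float nes_chroma_gain()
{
    const float NOMINAL_BURST = 40.0;               // IRE
    const float NES_BURST     = 40.0 * 4.0 / 3.0;   // assumed ~1.33x nominal
    return NOMINAL_BURST / NES_BURST;               // = 0.75 -> 25% less chroma
}
```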
I guess it has AGC properties, which could be very costly to reproduce with shaders/buffers, since we currently don’t have global frame or line variables, which could store peak values cheaply.
As I understand it, at least an entire line must be evaluated. Luckily the emulated output is predictable and also adjustable, producing the same results with less computing effort.
I mean, RA cores won’t degrade over time or push unpredictable “hot” chroma to the output buffer.
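For the record, one way around the “no global line variables” problem would be an extra reduction pass — this is a generic slang-style sketch I’m assuming (1×H pass, YIQ packed in rgb), not existing code:

```glsl
// Pass rendered at 1 x SourceHeight: each fragment walks one source line and
// writes that line's peak chroma amplitude, so a later pass can sample it as a
// cheap per-line lookup.
#version 450
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D Source;   // assumed: YIQ in rgb

void main()
{
    ivec2 size = textureSize(Source, 0);
    int   row  = int(vTexCoord.y * float(size.y));
    float peak = 0.0;
    for (int x = 0; x < size.x; x++) {                    // one fragment per line
        vec3 yiq = texelFetch(Source, ivec2(x, row), 0).rgb;
        peak = max(peak, length(yiq.yz));                 // chroma magnitude
    }
    FragColor = vec4(peak, 0.0, 0.0, 1.0);                // read back in a later pass
}
```

Costly per fragment, but only 1×H fragments run, so it stays cheap overall.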
Ok, I guess we can move past this now.
Here are my current notes:
- ntsc_iqdesat = “1.030000” — enough for a visible effect; must be kept very low. Yellows become very desaturated. Is it because combined luma and chrominance (voltage) is too high? I.e., is it doing what it’s supposed to? (See the sketch below.)
- ntsc_taps — adjust as needed. Adjusts chroma blur and affects desaturation.
- ntsc_sat = 1.10–1.30 — consumer-grade chroma pre-amp.
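Here is the concept I think those notes are poking at, written out — to be clear, this is NOT guest.r’s actual ntsc_iqdesat implementation, just a sketch of a “safe voltage” style clamp with an arbitrary threshold scale:

```glsl
// If luma plus chroma amplitude would exceed a "safe" composite excursion,
// scale the chroma back just enough to fit.
vec3 safe_voltage_desat(vec3 yiq, float safe_level)
{
    float chroma    = length(yiq.yz);          // instantaneous chroma amplitude
    float excursion = yiq.x + chroma;          // worst-case composite peak
    if (excursion > safe_level && chroma > 1e-6) {
        float allowed = max(safe_level - yiq.x, 0.0);
        yiq.yz *= allowed / chroma;            // desaturate only the overshoot
    }
    return yiq;
}
```

On a model like this, bright saturated colors — yellows especially, with high luma and high chroma — hit the ceiling first, which would line up with the observation that yellows desaturate the most.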
At this point it would be nice to have some real-world confirmation: did yellows become very unsaturated through RF?