Yes! Fixes all the issues I was seeing, thank you!
Just awesome
Yeah, so when a gamut conversion is done, the primary colours can only be simulated by this break-out (which makes sense). I noticed it myself when running a test case with red: an additional green virtual phosphor is lit on a pure-red screen, to make the DCI/2020 red softer (closer to 709 red).
In itself there’s not much wrong with the break-out into virtual phosphors, imo. The same would happen if you displayed a green that is (relatively) more yellowish than the native primary green of a CRT on that CRT. I.e. the virtual phosphor mask behaves like it should.
I can understand it becomes an issue when, in your case, you’re seeing greens with broken-out virtual phosphors and with a slightly different hue than before (or than what you’re used to).
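The break-out described above can be checked numerically. A minimal sketch, assuming the commonly published linear sRGB/709 → Display-P3 conversion matrix (values approximate; this is an illustration, not the shader's actual matrix):

```python
# Sketch: map a pure 709/sRGB red into Display-P3 primaries.
# SRGB_TO_P3 is the commonly published linear sRGB -> Display-P3
# conversion matrix (approximate values).

SRGB_TO_P3 = [
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9108],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by an RGB vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

pure_709_red = [1.0, 0.0, 0.0]  # linear light
p3_red = mat_vec(SRGB_TO_P3, pure_709_red)
print(p3_red)  # roughly [0.8225, 0.0332, 0.0171]
# The non-zero green (and blue) components are exactly the "extra"
# virtual phosphors lit on a pure-red screen to soften the wide-gamut red.
```

So on a P3-class display, about 3% green is mixed in to reproduce the 709 red, which matches the break-out you're seeing.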
However, a few thoughts come to mind:
- What is the correct green?
Ideally you would verify with a CRT of which you know with certainty that it is calibrated to exactly the 709 colour gamut (phosphor primaries exactly matching what the matrix in the shader is based on) AND that gamma is calibrated the same. That would eliminate the subjective factor and stick with the real CRT green hue.
- I noticed that the colour gamut conversion, and the hue impression at the output, is quite sensitive to gamma changes. So you may want to vary gamma in and out a bit to see what it does to the hue.
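The gamma sensitivity is easy to see numerically: the gamut matrix operates in linear light, and the linear-light channel ratios of a mixed colour depend on the decoding exponent. A small sketch with a hypothetical encoded colour, just to show the effect:

```python
# Sketch: the linear R/G ratio of a mixed colour shifts with the
# decoding gamma, which is why the hue impression moves when you
# change gamma in/out around the gamut conversion.

encoded = (0.8, 0.4, 0.0)  # some orange-ish encoded value (hypothetical)

ratios = {}
for gamma in (2.2, 2.4):
    r, g, _ = (c ** gamma for c in encoded)
    ratios[gamma] = r / g
    print(f"gamma {gamma}: linear R/G ratio = {r / g:.3f}")

# (0.8/0.4)^2.2 = 2^2.2 ~ 4.59 vs 2^2.4 ~ 5.28 -- the same encoded
# colour carries relatively more red through the gamut matrix at the
# higher gamma, so the reconstructed hue shifts.
```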
In that regard there’s a more general thing on my mind with the gamma curves. Some thoughts below; it would be interesting to hear your view on the matter.
For the CRT era, one could discuss what gamma curve/function best describes a CRT. This could be any of:
- pure power law function
- 709
- sRGB
- Some totally whacked function, because back in the day we would turn the “Contrast” knob on our TVs and monitors probably a bit higher than the studio-recommended “100 nits” (which is quite dim if you calibrate a CRT to that), and we would turn up the “Brightness” until we could see our screens in daylight, totally whacking the studio-recommended 2.2-gamma black level meant for a dark room. I’m sure these analog Contrast and Brightness settings did not amount to a clean change of a single gamma exponent, but actually reshaped the curve by quite a bit.
From my understanding, sRGB-calibrated curves are a thing from 1996 CRTs onwards. So what would best describe 80’s CRTs? Power-law gamma?
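For reference, the three candidate decode curves differ mostly near black. A minimal sketch of the standard definitions (pure power law, sRGB per IEC 61966-2-1, and the inverse of the BT.709 camera OETF):

```python
def power_law(v, gamma=2.4):
    """Pure power-law decode: a plain CRT exponent model."""
    return v ** gamma

def srgb_to_linear(v):
    """sRGB (IEC 61966-2-1) decode: linear segment near black."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def bt709_to_linear(v):
    """Inverse of the BT.709 OETF (the camera-side curve)."""
    return v / 4.5 if v < 0.081 else ((v + 0.099) / 1.099) ** (1 / 0.45)

# Near black the three diverge strongly, which is where the choice
# of curve matters most for shadow detail and hue:
for f in (power_law, srgb_to_linear, bt709_to_linear):
    print(f.__name__, f(0.05))
```

Note that BT.709 only defines the camera encoding; the actual display-side CRT behaviour is usually modelled as a power law (that is essentially what BT.1886 later standardised), which is part of why "which curve is the 80's CRT" is a genuine question.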
Then on the output (display) side we also have a multitude of possible functions, especially with the addition of HDR.
The issue at hand is what best describes the CRT we’re simulating (which gamma curve and values) and what display device we’re using (which curve and values): SDR LED? HDR LED? Chances are the curve of the simulated CRT and the output curve (your monitor) will differ, and ideally we would compensate for that difference.
Since we don’t know for sure which curve (power law, 709 or sRGB) best describes the CRT we’re simulating, especially with regard to the wild west of the 80’s, I’m thinking it could make sense for the shader to keep these functions separate from the colour gamut.
In the current shader setup, when you choose the 709-primaries colour gamut (which could be correct for an 80’s CRT), the 709 gamma curve is stitched to it. Same for sRGB. But maybe a pure power law would better describe the 80’s CRT?
Maybe ideally you could select a colour gamut and a gamma function separately from each other, as this freedom would allow a possibly closer/better simulation of CRTs from all eras (80’s through to 90’s).
My thoughts so far lead me to believe that it would be great to have the gamma curves in the shader, which are now stitched to specific colour spaces, separated out, just so that for example we could accommodate the above case.
Ideally we could do the same on the output side. Currently power-law gamma is stitched to the DCI space, while it could make a lot of sense to use the sRGB curve there: many current monitors “target” the DCI colour gamut (e.g. 90% coverage or whatnot), but their gamma is actually better described by sRGB than by a power law.
So the short story is whether it could make sense to make the gamma_in (CRT) curves and the output curves separate parameters, so that colour spaces and gamma functions can be mixed and matched, for more flexibility to accurately match/simulate the wide range of CRTs out there on the wide range of (O)LEDs out there.
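In pseudo-shader terms, the decoupling boils down to making the transfer function a parameter independent of the gamut matrix. A rough sketch; the parameter names (crt_gamma_type, display_gamma_type, etc.) are my own invention, not the shader's actual parameters:

```python
# Sketch of a decoupled pipeline: decode curve, gamut matrix and
# encode curve are all chosen independently. Names are hypothetical.

GAMMA_DECODE = {
    "power": lambda v, g: v ** g,
    "srgb":  lambda v, g: v / 12.92 if v <= 0.04045
             else ((v + 0.055) / 1.055) ** 2.4,
}
GAMMA_ENCODE = {
    "power": lambda v, g: v ** (1.0 / g),
    "srgb":  lambda v, g: 12.92 * v if v <= 0.0031308
             else 1.055 * v ** (1 / 2.4) - 0.055,
}

def simulate(pixel, gamut_matrix, crt_gamma_type="power", crt_gamma=2.4,
             display_gamma_type="srgb", display_gamma=2.2):
    # 1) decode with the CRT's curve, chosen independently of its gamut
    linear = [GAMMA_DECODE[crt_gamma_type](c, crt_gamma) for c in pixel]
    # 2) gamut conversion in linear light (any CRT-primaries matrix)
    linear = [sum(gamut_matrix[r][c] * linear[c] for c in range(3))
              for r in range(3)]
    # 3) re-encode with the display's curve, again chosen independently
    return [GAMMA_ENCODE[display_gamma_type](max(0.0, c), display_gamma)
            for c in linear]

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(simulate((0.5, 0.5, 0.5), identity))
```

The point is just that step 1 and step 3 each take their own "type" parameter, so a power-law 80's CRT can be simulated on an sRGB-curved DCI-gamut monitor without the curves being tied to the spaces.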
So for both the “Your Display” section and the “CRT” section, a “Gamma Type” could be introduced that offers the various gamma functions as a parameter broken out from the colour spaces (but with the same names, for easy identification that “by default” they should be matched up):
So for the “YOUR DISPLAY’S SETTINGS:” it could be something like:
SDR: Display's Colour Space: r709 | sRGB | DCI-P3
SDR: Gamma Type: r709 | sRGB | DCI-P3
SDR: Gamma Value: (1.0 - 5.0)
and on the “CRT SETTINGS:” it could be something like:
Colour System: r709 | PAL | NTSC-U | NTSC-J
Colour System Gamma Type: r709 | sRGB | Power Law
Colour System Gamma Value: (1.0 - 5.0)