That’s unexpected, actually it should be otherwise. Could you test without the scanlines (only grade)?
I didn’t do any changes to the phosphors gamut but it might behave differently due to the fixed rolled_gain. I will test tomorrow.
This is with your settings.
If you mean the blue bar, that’s the effect of adding blue (D65) to something already blue: it goes out of gamut and the excess values get clipped away. Something smarter like a perceptual rendering intent or CIECAM would be desirable, but I’m still studying that.
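To picture that clipping with some made-up numbers (not grade’s actual math), in Python:

import numpy as np

# Made-up numbers, not grade's pipeline: a saturated sky blue in linear sRGB,
# then a crude blue boost standing in for the cooler white point.
sky_blue   = np.array([0.10, 0.35, 0.95])
blue_boost = np.array([1.00, 1.00, 1.25])

shifted = sky_blue * blue_boost           # B becomes 1.1875 -> out of gamut
clipped = np.clip(shifted, 0.0, 1.0)      # hard clip: the extra blue is simply lost

print(shifted)   # [0.1  0.35 1.1875]
print(clipped)   # [0.1  0.35 1.    ]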
One option is to reduce Hue vs SAT Blue to -0.17.
Thanks for checking that.
Another option, I found, is to lower white-blue a bit.
Everything looks great otherwise.
@Nesguy did you check with the latest shader from Dogway’s repo, or the one from the buildbot / libretro git? His repo is the one up to date with the latest changes.
I find this very interesting; do you have an idea what blue primary CIE coordinate would mitigate this?
Would a monitor which supports Rec.2020 solve this? *** Or would a monitor whose native gamut can already reach a slightly more saturated blue be enough? If so, what coordinate should we target to get a reliable calibration?
*** I’m seeing on the Rec.2020 wiki that it has the widest gamut on blue.
The other “regular” gamuts (sRGB, DCI, Rec.709) seem to share the same blue primary, even though in the case of DCI they’re advertised as wide gamut… hmmm… which apparently is not true for blue, and/or not sufficient for “CRT blue”?
I was just using the “update shaders” option in RetroArch. I’ll check out the latest version on GitHub.
Yeah Nesguy is familiar with his GitHub lol.
Help is always appreciated though.
I really need to test this out but the lazy is real right now. House is too hot.
The problem with White-Blue is that it is generalized: it will lower blue everywhere that is not a primary. HUE vs SAT Blue is more akin to CIECAM behaviour.
I’m trying to settle down all my updates before I push a PR to RA repo. So far I’m happy with my latest updates but if you have any issues let me know so I can have a look before doing so.
@rafan, I recall mentioning this, but almost all gamuts are anchored at the blue primary. Rec.2020 is a virtual gamut in the sense that no display can show 100% of Rec.2020. The best course of action here is a smart gamut mapping. I think I got it, but I haven’t tested it enough these last days.
Any idea why these gamuts are anchored at this blue then (since it seems to fall short when compared to CRT)? I would assume the transition to the sRGB standard would have been done carefully, so as not to lose compatibility with the >1 billion CRT monitors out there? Or am I missing the clue here?
Ah, didn’t realize Rec.2020 isn’t available yet on any screens. It would be nice to see whether blue would show without clipping on a real late-90s Sony BVM CRT. It should, I guess, but then I wonder what blue phosphor those things were produced with.
Can you provide some more information on what you intend to do with smart gamut mapping?
I don’t know, but it’s probably related to the chemical properties of the material required for LED phosphors. Brands are having a hard time even producing a Rec.2020-compliant display; that can give you a clue as to why it’s not an easy task.
I guess a good comparison could be a Japanese BVM tuned to D93, then pass the test suite there.
What I’m trying to do is a rendering intent in ICC terms, or a chromatic adaptation in CIE terms. This is so all colors fit into the sRGB gamut in a smart way, with no clipping like that shown above.
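As a minimal sketch of the idea only (assuming a simple desaturate-towards-luminance fallback, not whatever grade will actually end up doing), in Python:

import numpy as np

# Rec.709 luma weights for linear RGB (an assumption; grade may weight differently).
LUMA = np.array([0.2126, 0.7152, 0.0722])

def compress_to_gamut(rgb):
    """Toy mapping: pull an out-of-range color toward its own luminance (the
    gray axis) just enough to fit into [0, 1], instead of clipping each
    channel independently."""
    rgb = np.asarray(rgb, dtype=float)
    gray = np.full(3, LUMA @ rgb)        # the color's luminance as a gray value
    lo, hi = 0.0, 1.0                    # blend factor towards gray
    for _ in range(30):                  # bisection is plenty for a demo
        t = 0.5 * (lo + hi)
        c = (1 - t) * rgb + t * gray
        if np.all((c >= 0.0) & (c <= 1.0)):
            hi = t                       # fits: try less desaturation
        else:
            lo = t                       # still out of gamut: desaturate more
    return (1 - hi) * rgb + hi * gray

# An out-of-gamut blue like the one above keeps its hue and brightness,
# it only loses some saturation instead of losing the excess outright.
print(compress_to_gamut([0.10, 0.35, 1.19]))   # ~[0.159, 0.352, 1.0]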
@Dogway I’m noticing the “Blue-Green Tint” adjustment is what I like with almost all the preset CRT gamuts in grade, so I’m just curious whether it would result in the same (or a better) picture if the -0.10 Blue-Green Tint adjustment were done through a direct “correction” of one of the CRT gamuts. Could you comment on what this “Blue-Green Tint” adjustment of -0.10 is actually doing in the context of the CIE primary coordinates? (And as such, whether I could also achieve this Tint adjustment through a modified/custom CRT gamut?)
Do you mean the HUE vs SAT blue? I never recommended that value for Blue-Green Tint, but rather an increase of +0.10, that means adding Green to Blue so the sky is Azure Blue and not Deep Blue. This is what happens when you first use a CRT gamut and later color manage RA to your display, which you can do now with a Reshade LUT (I just updated grade in the master repo 2 days ago).
In other words, if you color manage RA with a LUT you don’t need the Blue-Green Tint, but you still need to lower HUE vs SAT blue, because we are adding too much blue (out of gamut) with a D93 temperature.
Thanks for the answer, but I’m not touching the Hue vs Sat. I’m only changing “Blue-Green Tint”, the parameter as it is called in your shader:
#pragma parameter bg "Blue-Green Tint" 0.0 -1.0 1.0 0.005
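(For reference, the trailing numbers in that pragma are the standard libretro slang parameter fields, default, minimum, maximum and step, so -0.10 sits well inside the allowed -1.0 … 1.0 range.)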
Could you comment on what this “Blue-Green Tint” adjustment (of -0.10 in my case) is actually doing in context of the CIE primary coordinates? And as such whether I could also achieve this Tint adjustment through a modified/custom CRT gamut?
EDIT: I’m asking the above for the case of -NOT- color managing RA. Just plain vanilla grade shader on a good (say deltaE <3) factory calibrated sRGB monitor.
EDIT2: I’m also touching “Green-Blue Tint” and “Blue-Red Tint” a bit, but I don’t want to make my question above too complicated. For my understanding, I’m mostly interested in the “Blue-Green Tint” parameter as per your shader: what does a value of -0.10 mean in the context of the CIE primaries (is it a “lowering” of both the blue and green primary coordinates?)
No, it’s altering how much green is in the blue channel.
Yes that is what I observe, but my question is how does that relate to the CIE chromaticity primaries?
It seems I’m not making myself clear enough. Is setting the “Blue-Green Tint” bg parameter to -0.10 correlated with a shift in the color gamut “triangle”, yes or no? And if yes, what shift in the B and G CIE chromaticity primary coordinates does it correlate with? Or is this doing something totally different and not related to the CRT gamut?
That’s indeed the triangle I mean. So from your answer I understand that setting the “Blue-Green Tint” bg parameter to -0.10 correlates with a shift in the color gamut “triangle”, more precisely a downshift in the G and B coordinates?
Moving the “Blue-Green Tint” bg parameter to -0.10 doesn’t seem to introduce clipping, so I’m not quite understanding why changing the chromaticity values for G and B would? (I understand that theoretically for blue it would be the case, but see below…)
Or is it actually a case where only the green corner shrinks? I.e. would moving the “Blue-Green Tint” bg parameter to -0.10 only correlate with a lowering of the “G” coordinate: the gamut gets less saturated in green, so all values that are a mixture of green and blue get less saturated in green? Would that be the right analogy?
Edit: shrinking only green would also affect the green saturation for all values that are a mixture of red and green, which the “Blue-Green Tint” bg parameter at -0.10 doesn’t seem to do (at least not very noticeably). So I guess I still don’t understand how lowering “Blue-Green Tint” relates to the chromaticity diagram? Any help in understanding this relation would be great.
Edit 2: I’m aware that the triangle is a 2D simplification of a 3D gamut space (because of mixing with black and the white point), just so you know.
I think you are only changing the blue primary coordinates in the direction of the blue-green vector (arrow). The green and red primaries stay in place.
Edit2: And that’s why a LUT is better than a simple matrix.
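If it helps to see that numerically: under the assumption that the tint ends up as a plain linear mix of the primaries (the exact matrix and sign convention inside grade.slang may differ), mixing a little of the green primary into the blue primary slides the blue corner of the xy triangle along the blue-green edge, while red and green stay put. A rough Python check, using +0.10 (the azure direction mentioned earlier; a negative value slides the corner the opposite way):

import numpy as np

# sRGB / Rec.709 primaries and D65 white as xy chromaticities.
xy = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
wp = (0.3127, 0.3290)

def xy_to_XYZ(x, y, Y=1.0):
    """CIE xyY -> XYZ."""
    return np.array([x * Y / y, Y, (1 - x - y) * Y / y])

def XYZ_to_xy(XYZ):
    return XYZ[:2] / XYZ.sum()

# Build the RGB->XYZ matrix so that RGB=(1,1,1) maps to the D65 white point.
prims = np.column_stack([xy_to_XYZ(*xy[c]) for c in "RGB"])
scale = np.linalg.solve(prims, xy_to_XYZ(*wp))
M = prims * scale            # columns = XYZ of the R, G, B primaries at full drive

bg = 0.10                    # illustrative "Blue-Green Tint" amount (assumption)
blue_mixed = M[:, 2] + bg * M[:, 1]   # blue primary plus a little green primary

print("original blue xy:", XYZ_to_xy(M[:, 2]))     # ~ (0.150, 0.060)
print("mixed blue xy:   ", XYZ_to_xy(blue_mixed))  # ~ (0.164, 0.109)

The mixed blue lands on the straight line between the sRGB blue (0.15, 0.06) and green (0.30, 0.60) corners, which matches the arrow picture above.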
OK thanks, so then we arrive at the “problem” that even the wide gamuts (DCI, Adobe) have blue fixed at the sRGB primary of (0.15, 0.06)? I.e. even with “wide” gamut monitors we’ll clip in blue if we change the CRT gamut (grade shader) to use any blue values more saturated than sRGB blue?
Edit: I think one of the caveats with the LUT is that in DisplayCAL you need to set a panel technology type, which for some laptops or even desktop monitors is not known with certainty. That makes it a trial-and-error procedure, where you may end up with a calibration that is less accurate than no calibration at all, because it was done with the “wrong” panel type selected.
Edit 2: I’m looking into the various monitor options, so I was contemplating the LG 27GL850. But from reading the DisplayCAL forums, it uses Nano IPS, for which there are NO correction curves available in DisplayCAL. Which in other words means this panel can’t be usefully calibrated through DisplayCAL, at least as far as my understanding goes.