Dogway's grading shader (slang)

Any idea why these gamuts are anchored at this blue, then (since it seems to fall short compared to CRT)? I would assume that with the transition to the sRGB standard this would have been done carefully so as not to lose compatibility with the >1 billion CRT monitors out there? Or am I missing the clue here?

Ah, I didn’t realize Rec.2020 isn’t available on any screens yet. It would be nice to see whether blue would show without clipping on a real late-’90s Sony BVM CRT. It should, I guess, but then I wonder what blue phosphor those things were produced with.

Can you provide some more information on what you intend to do with smart gamut mapping?


I don’t know, but it’s probably related to the chemical properties of the materials required for LED phosphors. Brands are having a hard time even producing a Rec.2020-compliant display, which gives you a clue as to why it’s not an easy task.

I guess a good comparison could be a Japanese BVM tuned to D93, and then running the test suite there.

What I’m trying to do is a rendering intent in ICC terms, or a chromatic adaptation in CIE terms. The idea is that all colors fit into the sRGB gamut in a smart way, so there is no clipping like that shown above.


@Dogway I’m noticing the “Blue-Green Tint” adjustment is what I like with almost all the preset CRT gamuts in grade, so I’m just curious whether it would result in the same (or better) picture if the “-0.10” Blue-Green Tint adjustment were done through direct “correction” of one of the CRT gamuts. Could you comment on what this “Blue-Green Tint” adjustment of -0.10 is actually doing in the context of the CIE primary coordinates? (And, as such, whether I could also achieve this tint adjustment through a modified/custom CRT gamut?)


Do you mean the HUE vs SAT blue? I never recommended that value for Blue-Green Tint, but rather an increase of +0.10; that means adding Green to Blue so the sky is Azure Blue and not Deep Blue. This is what happens when you first use a CRT gamut and later color manage RA to your display, which you can now do with a Reshade LUT (I just updated grade in the master repo 2 days ago).

In other words, if you color manage RA with a LUT you don’t need the Blue-Green Tint, but you still need to lower HUE vs SAT blue because we are adding too much blue -out of gamut- with a D93 temperature.


Thanks for the answer, but I’m not touching the Hue vs Sat. I’m only changing “Blue-Green Tint”, the parameter as it is called in your shader:

#pragma parameter bg "Blue-Green Tint" 0.0 -1.0 1.0 0.005

Could you comment on what this “Blue-Green Tint” adjustment (of -0.10 in my case) is actually doing in the context of the CIE primary coordinates? And, as such, whether I could also achieve this tint adjustment through a modified/custom CRT gamut?

EDIT: I’m asking the above for the case of -NOT- color managing RA. Just plain vanilla grade shader on a good (say deltaE <3) factory calibrated sRGB monitor.

EDIT2: I’m also touching “Green-Blue Tint” and “Blue-Red Tint” a bit, but I don’t want to make my question above too complicated. For my understanding, I’m mostly interested in what a value of -0.10 for the “Blue-Green Tint” parameter in your shader means in the context of the CIE primaries (is it a “lowering” of both the blue and green primary coordinates?)

No, it’s altering how much green is in the blue channel.
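A minimal sketch of that reading (hypothetical code, not grade’s actual implementation): the “X-Y Tint” controls act like a channel mixer, and bg is the fraction of the blue input that leaks into the green output.

```glsl
// Hedged sketch, not grade's actual code: "Blue-Green Tint" (bg) as a
// channel-mixer term. With bg = +0.10 pure blue picks up some green
// (azure sky); with bg = -0.10 it loses green instead.
vec3 blue_green_tint(vec3 rgb, float bg)
{
    rgb.g += bg * rgb.b;            // green output gains/loses a share of the blue input
    return clamp(rgb, 0.0, 1.0);    // keep the result in display range
}
```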

Yes that is what I observe, but my question is how does that relate to the CIE chromaticity primaries?

It seems I’m not making myself clear enough. Is setting “Blue-Green Tint” bg parameter to -0.10 correlated to a shift in the color gamut “triangle”, yes or no? And if yes, what shift in the B and G CIE chromaticity primary coordinates does it correlate to? Or is this doing something totally different and not related to the CRT gamut?

You mean this? You’ll clip tho:

That’s indeed the triangle I mean. So from your answer I understand that setting “Blue-Green Tint” bg parameter to -0.10 correlates to a shift in the color gamut “triangle”, more precisely correlates to a downshift in G and B coordinates?

Moving “Blue-Green Tint” bg parameter to -0.10 doesn’t seem to introduce clipping, so I’m not quite understanding why changing the chromaticity values for G and B would? (I understand theoretically for Blue it would be the case, but see below…)

Or is it actually a case where only the green corner shrinks? I.e., would moving the “Blue-Green Tint” bg parameter to -0.10 only correlate to a lowering of the “G” coordinate: the gamut gets less saturated in green, so all values that are a mixture of green and blue get less saturated with green? Would that be the right analogy?

Edit: shrinking only green would also affect the green saturation for all values that are a mixture of red and green, which setting the “Blue-Green Tint” bg parameter to -0.10 doesn’t seem to do (at least not very noticeably). So I guess I still don’t understand how lowering “Blue-Green Tint” relates to the chromaticity diagram? Any help in understanding this relation would be great.

Edit 2: I’m aware that the triangle is a 2D simplification of a 3D gamut space (because of mixing with black and the whitepoint), just that you know.

I think you are only changing the blue primary coordinates in the direction of the blue-green vector (arrow). The green and red primaries stay in place.

Edit2: And that’s why a LUT is better than a simple matrix.
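To connect that to the chromaticity diagram, here is a rough illustration of my own (not code from grade): leaking a fraction t of the blue input into the green output makes the effective blue primary a blend of the display’s blue and green primaries, which you can then project back to CIE xy. The mixing has to happen in linear XYZ, not directly on the xy coordinates.

```glsl
// Illustration only: the effective blue primary after adding t * blue into
// the green channel is XYZ_blue + t * XYZ_green (tristimulus values add
// linearly). Projecting back to xy shows the corner sliding along the
// blue-green edge of the triangle.
vec2 shifted_blue_primary(vec3 XYZ_blue, vec3 XYZ_green, float t)
{
    vec3 XYZ = XYZ_blue + t * XYZ_green;         // new "blue" stimulus
    return XYZ.xy / (XYZ.x + XYZ.y + XYZ.z);     // xyY projection: (x, y)
}
```

For a negative t the corner moves the other way, past the display’s actual blue primary, which is why a plain matrix ends up clipping there.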


OK thanks, so then we arrive at the “problem” that even wide gamuts (DCI, Adobe) have blue fixed at the sRGB primary of 0.15, 0.06? I.e., even with “wide” gamut monitors we’ll clip in blue if we change the CRT gamut (grade shader) to use any blue values more saturated than sRGB blue? :frowning:

Edit: I think one of the caveats with the LUT is that with DisplayCAL you need to set a panel technology type, which for some laptops or even desktop monitors is not known with certainty. That makes it a trial-and-error procedure, where you may end up with a calibration that is less accurate than no calibration at all, because it was done with the “wrong” panel type selected.

Edit 2: I’m looking into the various monitor options, so I was contemplating the LG-27GL850. But from reading the DisplayCAL forums, it uses nano-IPS, for which there are NO correction curves available in DisplayCAL. Which, in other words, means this panel can’t be usefully calibrated through DisplayCAL, at least as far as my understanding goes.

That’s why my interest in gamut mapping: you compress values to keep the perceptual relative saturation (hue aligned), instead of clipping. This might work better for highly saturated sources like games, but it’s hard maths and totally undocumented (math-wise). I haven’t studied maths since school, a long time ago, so for me it’s hard to figure out how to intersect a plane with a 3D gamut and things like that.

The Wikipedia link I shared falls short just before explaining the compression algorithm. Here, in an AMD brochure, you have a very graphical explanation of how compression is performed, but no code or maths.
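For what it’s worth, here is a very rough sketch of the basic idea (my own simplification, not the AMD method and not what grade does): out-of-gamut values are pulled toward a neutral of the same luminance just enough to fit, instead of being clipped per channel.

```glsl
// Simplified gamut-compression sketch: desaturate toward grey of equal luma
// by the smallest amount that brings every channel back into [0,1].
// Assumes the luminance itself is already within range.
vec3 compress_to_gamut(vec3 rgb)
{
    float Y  = dot(rgb, vec3(0.2126, 0.7152, 0.0722));   // Rec.709/sRGB luma
    float lo = min(rgb.r, min(rgb.g, rgb.b));
    float hi = max(rgb.r, max(rgb.g, rgb.b));
    float t  = 0.0;
    if (lo < 0.0) t = max(t, -lo / (Y - lo));             // fix negative channels
    if (hi > 1.0) t = max(t, (hi - 1.0) / (hi - Y));      // fix channels above 1
    return mix(rgb, vec3(Y), t);                          // blend toward neutral
}
```

A proper operator would also add a smooth knee so that colors just inside the boundary are compressed slightly too, preserving relative saturation; that is the part the AMD slides illustrate graphically.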

One option would be to create a LUT to convert from a specific gamut (like a CRT gamut) to sRGB. DisplayCAL (Argyll, really) uses CIECAM02, I think, so that’s an option; I just haven’t delved into how to use a user-predefined characterization.
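For comparison, the “simple matrix” route mentioned earlier in the thread (relative colorimetric, so it clips instead of compressing) can be sketched like this; the helper derives an RGB-to-XYZ matrix from any set of xy primaries and white point, and chaining two of them gives CRT-to-sRGB. This is my own illustration, not grade’s code.

```glsl
// Build an RGB->XYZ matrix from xy primaries and a white point (standard derivation).
mat3 rgb_to_xyz(vec2 r, vec2 g, vec2 b, vec2 w)
{
    vec3 R = vec3(r.x / r.y, 1.0, (1.0 - r.x - r.y) / r.y);   // xyY -> XYZ with Y = 1
    vec3 G = vec3(g.x / g.y, 1.0, (1.0 - g.x - g.y) / g.y);
    vec3 B = vec3(b.x / b.y, 1.0, (1.0 - b.x - b.y) / b.y);
    vec3 W = vec3(w.x / w.y, 1.0, (1.0 - w.x - w.y) / w.y);
    vec3 S = inverse(mat3(R, G, B)) * W;                      // scale so RGB(1,1,1) maps to white
    return mat3(S.r * R, S.g * G, S.b * B);
}

// Chain: CRT gamut -> XYZ -> sRGB, then clamp (i.e. clip):
// mat3 crt_to_srgb = inverse(rgb_to_xyz(srgb_r, srgb_g, srgb_b, d65))
//                  * rgb_to_xyz(crt_r, crt_g, crt_b, crt_white);
```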

As for the correction files, I (and most people) really don’t fuss much about them, for the simple reason that not all devices are the same anyway. There are worse offenders, like surround luminance.

I read a few people raising questions about the color temperature in grade, so I went ahead and added a mathematically accurate function in CIE xy.

Colors are now more accurate as far as I could check, but for a no-op still use D55, as D65 is slightly blue. I don’t want to add a source-to-target color temperature conversion since it’s a lot of code, but I guess we can subtract the difference for our target temperature. That is, if we want a temperature of D93 (8942K), then use 7942K. Do you think I should perform this under the hood, or what’s your opinion?
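For reference, a common closed-form way to get CIE xy from a correlated color temperature is the CIE daylight-locus approximation (valid roughly 4000K–25000K); I don’t know if this is the exact function grade now uses, so treat it as a sketch of the idea.

```glsl
// CIE daylight locus: x as a cubic in 1/T (two ranges), then y as a quadratic in x.
vec2 daylight_xy(float T)
{
    float x = (T <= 7000.0)
        ? 0.244063 +  99.11 / T + 2.9678e6 / (T * T) - 4.6070e9 / (T * T * T)
        : 0.237040 + 247.48 / T + 1.9018e6 / (T * T) - 2.0064e9 / (T * T * T);
    float y = -3.000 * x * x + 2.870 * x - 0.275;
    return vec2(x, y);
}
```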

Updated grade is in official slang repo.


IMO it’s a good idea for this to be done under the hood since the user probably doesn’t know that they should subtract the difference for the target temperature.


OK, this changes the current behavior, at least for you guys that are up to date, but I will assume the content (games) is D65, so this should better reflect the target temperature; as far as I could test, it’s a bit more neutral (less blue).

I feel bad for hunterk as I said I wasn’t going to do many changes >.<


Hi @Dogway, I’m working the grade shader into the mega bezel and I wanted to know which LUT files are supposed to be used for LUT1 and LUT2?


Any LUT that matches the LUT size is fine. My LUT size defaults are 16 and 64, so for simplicity I use the “reshade\shaders\LUT\16.png” and “reshade\shaders\LUT\64.png” identity LUTs as no-ops.


Sort of on topic:

Is there a way to do gamma correction as the very last step, after scanlines, mask, glow etc?

All of these things alter the gamma so if we do gamma correction first we don’t have the same gamma after applying scanlines etc. Wouldn’t it make sense to do all the other stuff first, then correct gamma?

Yeah, you’d either have to add gamma correction to the final pass or add another pass at the end for gamma correction.

Makes the most sense to incorporate it into the last pass though.
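Something along these lines could work as a stand-alone last pass (a hypothetical sketch in standard slang boilerplate, not an existing shader): append it after the CRT passes in the preset and it applies output gamma to whatever the mask/scanline/glow passes produced.

```glsl
#version 450

// Hypothetical final pass: apply output gamma after everything else.
layout(push_constant) uniform Push
{
    vec4 SourceSize;
    vec4 OutputSize;
    uint FrameCount;
    float out_gamma;
} params;

#pragma parameter out_gamma "Final Output Gamma" 2.20 1.00 4.00 0.05

layout(std140, set = 0, binding = 0) uniform UBO
{
    mat4 MVP;
} global;

#pragma stage vertex
layout(location = 0) in vec4 Position;
layout(location = 1) in vec2 TexCoord;
layout(location = 0) out vec2 vTexCoord;

void main()
{
    gl_Position = global.MVP * Position;
    vTexCoord   = TexCoord;
}

#pragma stage fragment
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D Source;

void main()
{
    vec3 c = texture(Source, vTexCoord).rgb;   // output of the previous (mask/scanline/glow) pass
    FragColor = vec4(pow(c, vec3(1.0 / params.out_gamma)), 1.0);
}
```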

You mean scanline dynamics affect gamma? Well, that’s what they’re supposed to do, I guess. You can compensate using grade’s gamma; after all, gamma adjustments compose (the exponents multiply), so it doesn’t matter in what position you place them unless you need an operation in a certain gamma space.

I’m emulating signal gamma, and as far as I’m concerned that happens before the scanlines, so it’s wise to keep some parity with what the hardware does.