Dogway's grading shader (slang)

So is CRT gamma more accurately described as “CRT signal gamma”?

Yes, I would think this “signal gamma” is what brands cranked up to compensate for the dim/dark surround, as 2.222 is more for daylight. As far as I’m concerned, scanlines are something that happens very close to the end of the chain, so they would go after this signal gamma.

I also want to take the opportunity to announce a video guide I just made about Calibration and Color Management. As you might already know, I do Computer Graphics and develop tools. The video guide includes a calibration and profiling tutorial for DisplayCAL, as well as LUT authoring for several CG packages, but also other media apps like Retroarch, OBS, and so on. If you own a colorimeter and are interested, PM me and I can provide a 33% discount code at Gumroad.

4 Likes

@Dogway I was reading the Poynton paper “The rehabilitation of gamma” here https://poynton.ca/PDFs/Rehabilitation_of_gamma.pdf .

It’s an interesting read (18 pages of high-level stuff on gamma alone) even though most of it is above my level. I was especially interested in two remarks about misconception vs. fact; see page 2 of the paper, quoted below:

Misconception: A CRT is characterized by a power function that relates luminance L to voltage V′: L = (V′)^γ.

Fact: A CRT is characterized by a power function, but including a black-level offset term: L = (V′ + ε)^γ. Usually, γ has a value quite close to 2.5; if you’re limited to a single-parameter model, L = (V′ + ε)^2.5 is much better than L = (V′)^γ.

Misconception: The exponent γ varies anywhere from about 1.4 to 3.5.

Fact: The exponent itself varies over a rather narrow range, about 2.35 to 2.55. The alleged wide variation comes from variation in the offset term of the equation, not the exponent: wide variation is due to failure to correctly set the black level.

So I was wondering about two things:

  1. whether Grade already accounts for the “black-level offset term” in the gamma equation as above, or whether it would be useful to do so?
  2. A CRT, when properly adjusted, has a black level of about 0.01 cd/m², whereas a properly adjusted IPS LED panel has a black level of about 0.30 cd/m² (or worse): 30 times as high. Could this higher black level of the IPS panel be accounted for in grade’s gamma function by using the above black-level offset term?
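Poynton’s two “facts” can be illustrated numerically. A minimal sketch (my own, not from the paper or from grade) showing how a mis-set black level makes a fixed 2.5 exponent look like a widely varying gamma:

```python
# Illustrative sketch (mine, not grade's code): the CRT model with a
# black-level offset term, L = (V' + eps) ** 2.5. The exponent is fixed,
# yet a small offset error makes the *apparent* single-exponent gamma
# swing over the oft-quoted "wide" range.
import math

def crt_luminance(v, eps, gamma=2.5):
    """Offset power-law model: L = (V' + eps) ** gamma."""
    return max(v + eps, 0.0) ** gamma

def apparent_gamma(eps, v=0.5, gamma=2.5):
    """Exponent a pure power law L = V' ** g would need to match at V' = v."""
    return math.log(crt_luminance(v, eps, gamma)) / math.log(v)

print(round(apparent_gamma(+0.05), 2))  # black raised  -> ~2.16
print(round(apparent_gamma(0.00), 2))   # correct black -> 2.5
print(round(apparent_gamma(-0.05), 2))  # black crushed -> ~2.88
```

So a ±0.05 black-level error alone is enough to explain the “1.4 to 3.5” folklore without the exponent ever leaving 2.5.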

Hopefully you can shed some light on this :slight_smile:

3 Likes

Thanks for the paper. A few things are a bit ambiguous to me, but that must be because the paper is a bit old (circa 1998).

It took me a while to grasp, but the whole point of the paper is about that “offset” or “black level” from which a power-law gamma is described. We already account for this, as we are aware that CRTs didn’t employ a pure power-law gamma function but one with a “linear segment towards black”. This is already done in the moncurve() functions in grade.
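For illustration, the “linear segment towards black” shape can be seen in the standard sRGB EOTF; this is only an example of the shape, not grade’s actual moncurve() parameterization:

```python
# Sketch of a "power law with a linear segment towards black", using the
# standard sRGB EOTF as a concrete example of the shape. Grade's actual
# moncurve() functions are parameterized differently; this only shows
# why the toe matters: near black the curve is linear, not a power law,
# so black is not infinitely compressed.
def srgb_eotf(v):
    """sRGB electro-optical transfer function (encoded -> linear light)."""
    if v <= 0.04045:
        return v / 12.92            # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4

print(srgb_eotf(0.02))          # ~0.00155 via the linear toe
print(0.02 ** 2.4)              # ~0.000084 via a pure power law
print(round(srgb_eotf(0.5), 4)) # ~0.2140 at mid-grey
```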

They also talk about a gamma of 2.35-2.55 instead of the theoretical 2.222; that’s also my recommendation in the grade presets, but as the paper indicates, it has to do with the viewing surround, which was typically dim.

There are other black level offsets in the pipeline. There’s the black level pedestal of 7.5 IRE, but that only happens for NTSC-U and it’s invisible to the user.

Then there’s the exposed black level, which is a bit more artistic; here you can emulate the CRT brightness of 0.01 cd/m², but to be fair that’s at signal level. CRT displays are way more reflective than current LED panels, as we discussed recently; that’s why they look greyish even when turned off. Link

2 Likes

Thanks for the insight, good to know grade already incorporates the adjustments to the gamma curve with the linear segment towards black.

It’s quite interesting to see how important gamma is to correct color reproduction. I was thinking of reproducing a vintage CRT’s gamma of 2.5 theoretically with the current gamma pipeline, but then I realized I don’t quite understand the complete pipeline. Hopefully you can shed some light on this. I’m deliberately using an RGB intensity value below to see what goes in and what comes out.

Emulator gamma pipeline

  1. BSNES core outputs raw RGB pixel value (from range 0-255) : R(ed)=128
  2. Dogway’s Grade encodes value with CRT gamma 2.5: R(ed) = ((128 / 255) ^ (1 / 2.5)) * 255 = 194
  3. guest-dr-venom gamma_input DECODES it with gamma 2.4: R(ed) = ((194 / 255) ^ (2.4)) * 255 = 132
  4. guest-dr-venom gamma_out encodes it with gamma 2.4: R(ed) = ((132 / 255) ^ (1 / 2.4)) * 255 = 194
  5. Retroarch graphics API outputs value for videocard. Retroarch gfx api encodes it with gamma 2.2 : R(ed) = ((194 / 255) ^ (1 / 2.2)) * 255 = 225
  6. My PC attached IPS monitor decodes value with gamma 2.33 (DisplayCAL measured): R(ed) = ((225 / 255) ^ (2.33)) * 255 = 190

So for my pipeline with Grade gamma at 2.5 an RGB pixel intensity value of R(ed)=128 goes in and an RGB pixel intensity value of R(ed)=190 gets sent to my eyes.
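The six steps above as a quick sketch, using the same pure power-law arithmetic (a simplification, since grade’s real curves are piecewise):

```python
# The six steps above as plain power-law gamma on 8-bit values. This only
# reproduces the arithmetic as stated; grade's real curves are piecewise,
# so treat it as the simplified model, not the shader's actual math.
def encode(v, g):
    """Gamma-encode an 8-bit value with exponent 1/g."""
    return round(((v / 255) ** (1 / g)) * 255)

def decode(v, g):
    """Gamma-decode an 8-bit value with exponent g."""
    return round(((v / 255) ** g) * 255)

v = 128               # 1. core output
v = encode(v, 2.5)    # 2. grade CRT gamma    -> 194
v = decode(v, 2.4)    # 3. gamma_input        -> 132
v = encode(v, 2.4)    # 4. gamma_out          -> 194
v = encode(v, 2.2)    # 5. gfx API            -> 225
v = decode(v, 2.33)   # 6. IPS monitor        -> 190
print(v)              # 190: what reaches the eyes
```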

Real SNES hardware gamma pipeline

So what happens with a real SNES connected to vintage CRT? Does the SNES do gamma encoding or not? Both cases below:

SNES does gamma encoding?

  1. SNES encodes R(ed)=128 with gamma 2.5: R(ed) = ((128 / 255) ^ (1 / 2.5)) * 255 = 194
  2. Vintage CRT decodes it with gamma 2.5: R(ed) = ((194 / 255) ^ (2.5)) * 255 = 128

SNES outputs raw RGB intensity values (NO gamma encoding)

  1. SNES outputs R(ed)=128
  2. Vintage CRT decodes it with gamma 2.5: R(ed) = ((128 / 255) ^ (2.5)) * 255 = 46
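The two cases can be checked without intermediate 8-bit rounding; note that rounding 193.6 up to 194 is why decoding 194 by 2.5 would land on 129 rather than exactly 128:

```python
# Both cases above, computed without intermediate 8-bit rounding so the
# encode/decode round trip in case 1 cancels exactly.
raw = 128 / 255

# Case 1: SNES gamma-encodes, CRT decodes -> the operations cancel
encoded = raw ** (1 / 2.5)
case1 = round(encoded ** 2.5 * 255)

# Case 2: SNES outputs raw values, the CRT still decodes with 2.5
case2 = round(raw ** 2.5 * 255)

print(case1, case2)   # 128 46
```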

Mismatch emulator output and real SNES:

So for a real SNES connected to a vintage CRT (gamma = 2.5), when the SNES outputs an intensity value for a red pixel of 128, what gets displayed on the vintage CRT monitor is an intensity value of either 128 or 46, depending on whether a real SNES does gamma encoding of the output signal or not.

In the emulator gamma pipeline when the RA Bsnes core outputs an intensity value for a red pixel of 128 and the Grade shader CRT output is set to gamma=2.5 then an intensity value of 190 is displayed on the LED monitor. A large discrepancy versus real SNES hardware connected to a vintage CRT monitor!

@Dogway Since the discrepancy between the input and output for both pipelines is so large, the real pipelines are very probably different from what is described above.

This is deliberately a starting point to get to the real answer for both pipelines.

My question is what the real pipelines for both the “Emulator gamma pipeline” and the “Real SNES hardware gamma pipeline” look like. Could you take both pipelines depicted above and rearrange/add/remove steps where necessary, so that we get a true picture of both?

1 Like

In your examples you are using power-law gamma operations, but as I said, grade uses “sRGB”-type gamma functions with a linear segment towards black. I do several gamma operations because no developer could confirm to me why the emulators are not outputting RAW values and instead rely on a gamma-encoded output. For this reason I assume the output is already sRGB gamma encoded, so the first step in grade is to linearize that with the inverse operation. Once linearized, I apply a SMPTE gamma function with a gamma_in value, in your case 2.5. This is now our CRT ground truth. We relinearize it with the sRGB inverse function, do all our grade operations, and output with the color_space-related gamma; if sRGB, it will cancel the gamma and match the CRT ground truth.
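A sketch of the sequence just described, under the assumption that the “SMPTE gamma function” is the Rec.601/709-style OETF (linear toe below 0.018) with its exponent generalized to 1/gamma_in; grade’s actual constants may differ:

```python
# Sketch of the described sequence: linearize assumed-sRGB core output,
# apply a SMPTE-style curve (the CRT ground truth), relinearize for the
# grading operations, then the sRGB output encode cancels that step.
def srgb_decode(v):
    """sRGB encoded -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(l):
    """Linear light -> sRGB encoded."""
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def smpte_encode(l, gamma):
    """SMPTE-style OETF with a linear segment towards black (assumed form)."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** (1 / gamma) - 0.099

core_out = 0.5                        # assumed sRGB-encoded core output
linear   = srgb_decode(core_out)      # step 1: linearize
crt      = smpte_encode(linear, 2.5)  # step 2: CRT ground truth (gamma_in)
linear2  = srgb_decode(crt)           # step 3: relinearize for grading
out      = srgb_encode(linear2)       # step 4: sRGB output cancels step 3
print(abs(out - crt) < 1e-9)          # True: the ground truth is preserved
```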

If you want to get the correct gamma output from Retroarch to match your calibrated display you have to color manage retroarch. For that use DisplayCAL and use the Reshade LUT output, you can load that in grade. (Or calibrate your display to 2.2 gamma)

The snes has to encode gamma for the composite signal, it uses the SMPTE gamma function.

2 Likes

Thanks for the explanation, I think I understand the process now:

  1. bsnes core output is assumed to be encoded with srgb gamma (2.2)
  2. grade shader decodes output from 1 with inverse operation: we have RAW values now
  3. RAW values are encoded by grade with SMPTE gamma function (2.5 in my case)
  4. grade linearizes again with sRGB inverse operation, does grade operations, and outputs/encodes to target space, which if sRGB cancels out the earlier gamma operation (output values of bullet 3. are maintained)
  5. LED monitor decodes with sRGB gamma 2.2 (calibrated)

So in summary, if we strip out all canceling encoding/decoding steps :

  1. effectively, RAW values are encoded by grade with the SMPTE gamma function and a value of 2.5 (in my case; a setting between 2.35 and 2.55 is recommended in the grade shader guide)
  2. LED monitor decodes with sRGB gamma 2.2 (calibrated)
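With pure power laws for simplicity, these two remaining steps collapse to a single net exponent of 2.2/2.5 = 0.88, i.e. the displayed image ends up slightly brighter than the raw values:

```python
# The two stripped-down steps with pure power laws for simplicity (the
# real curves have linear toes): encode with 1/2.5, display decodes with
# 2.2, so end to end the image follows v ** (2.2 / 2.5) = v ** 0.88.
v = 0.5
displayed = (v ** (1 / 2.5)) ** 2.2
print(round(displayed, 4))            # ~0.5434, brighter than raw 0.5
print(round(v ** (2.2 / 2.5), 4))     # same value: net exponent 0.88
```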

Compare this to real SNES hardware connected to a vintage CRT, as that will most likely look like:

  1. RAW values are encoded by SNES hardware with gamma of 2.35 - 2.55 (is the real value known?).
  2. Vintage CRT decodes with gamma of 2.35 - 2.55 (the exact value will differ per TV set)

Step 1 is comparable between emulation and real hardware.

But I see an issue with step 2: a vintage CRT decodes with a gamma between 2.35 and 2.55, while in the emulated setup the LED monitor decodes at 2.2 (calibrated).

So what am I overlooking in this?

1 Like

Yes, gamma is a complex topic, look at this paper to know all the typical different gammas.

The SNES has to gamma encode the signal before modulation. I just read that gamma-correction hardware is/was very expensive, so my guess is that the games are already gamma corrected. That would change things a bit, but in any case I opted to keep things open.

On paper the signal is encoded with 2.222 SMPTE gamma to compensate for the CRT’s 2.2 calibrated gamma. In reality CRTs were more like 2.5 gamma, and the signal, at least for SNES games, was authored with 2.5 gamma (?).

I chose to keep the encoding-decoding within a single function (gamma_in) so it is more user friendly. In the end it is very similar to your first bullet point for the SNES hardware, probably more towards 2.35 gamma, but who knows.

On your last issue: it’s not an issue at all. If you take a greyscale ramp with gamma 2.2 and raise the gamma, the view-referred greyscale gamma is no longer canceled regardless of monitor gamma; you are leaving the offset, which is what we care about.

I might try the above things (gamma assumed) but I’m a bit busy lately.

1 Like

I had a similar thought :slight_smile: . In the Poynton paper mentioned earlier, there’s this interesting quote:

“A CRT’s response is very nearly the inverse of the lightness sensitivity of vision: <…>The fact that a CRT’s transfer function is very nearly the inverse of the lightness sensitivity of vision is an amazing, and fortunate, coincidence!”

This would make it logical, in addition to gamma-correction hardware being expensive, that it simply wasn’t necessary to incorporate any gamma-encoding hardware in the early days of computing. Games could just be authored on a CRT, as human vision would invert the non-linear response of the CRT, leaving a pretty much linear response to intensity values. Amazing coincidence indeed.

You already have this on your mind, but for my understanding would that “simplify” the emulation gamma pipeline to:

  1. make sure the emulator outputs RAW values (undo any sRGB gamma encoding if necessary)
  2. Gamma encode the values, applying a gamma correction for the difference between the user’s monitor gamma (usually 2.2, or the DisplayCAL-measured value) and the target CRT gamma (2.5)

Could we possibly end up with two grade gamma settings: one for users’ PC monitor gamma (as some people may want to use their displayCAL measured value instead of default setting) and one for target CRT gamma?
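That “difference” correction in step 2 can be reduced to a single exponent, assuming pure power laws (the names below are mine, not grade parameters): reproducing a 2.5-gamma CRT’s luminance on a 2.2-gamma monitor means encoding raw values with exponent 2.5/2.2:

```python
# Step 2 reduced to a single exponent, assuming pure power laws. A raw
# value v on a 2.5-gamma CRT gives luminance v ** 2.5; to get the same
# luminance from a 2.2-gamma monitor, send w = v ** (2.5 / 2.2).
crt_gamma, monitor_gamma = 2.5, 2.2

def correct(v):
    return v ** (crt_gamma / monitor_gamma)

v = 0.5
same = abs(correct(v) ** monitor_gamma - v ** crt_gamma) < 1e-12
print(round(correct(v), 4), same)   # ~0.4549: a darker signal is sent
```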

1 Like

I was thinking of one other thing: Retroarch’s gfx drivers (GL / Vulkan / D3D11 etc.) may be setting sRGB space by default. If I search the Retroarch GitHub for “sRGB” I get loads of hits. Things like:

#if !defined(HAVE_OPENGLES)
   if (framebuffer && framebuffer->get_format() == GL_SRGB8_ALPHA8)
      glEnable(GL_FRAMEBUFFER_SRGB);
   else
      glDisable(GL_FRAMEBUFFER_SRGB);
#endif

I wouldn’t be surprised if sRGB space were set by default for these drivers, which could mean that your earlier comment “so the first step in grade is to linearize that with the inverse operation” would still be valid?

Maybe @hunterk could confirm whether Retroarch gfx APIs set the framebuffer to sRGB colorspace by default?

It would be very helpful to have this confirmed to ensure that we achieve correct gamma behaviour in grade.

No, it only sets it to sRGB when you set srgb_framebuffer to “true” in the preset. That snippet says:

if *not* using GLES (i.e., if it's desktop OpenGL; GLES is weird about sRGB)
   then -> if (there's a framebuffer *and* that framebuffer's format is set to sRGB)
      use an sRGB framebuffer
      otherwise explicitly *don't* use an sRGB framebuffer
2 Likes

Thanks :slight_smile:

Good to know that for shaders the default is to -not- set sRGB.

Upon further reading, it seems it’s also possible that gamma is set by default through the graphics API at the end of the pipeline (so unrelated to shader settings). I can imagine this is also the reason why it’s important to have it off by default in the shader, as otherwise there would be double gamma encoding.

For example, for D3D the link below suggests that values written out to the monitor are gamma corrected by default. Looking further, this is something that can be set explicitly or not; the important part is that gamma can also be set at this stage of the pipeline, unrelated to shader settings.

https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/using-gamma-correction

My assumption would be that it uses the default behaviour of the various video drivers (D3D, GL, Vulkan), which would be to encode the signal to sRGB space upon sending data to the monitor. Or, as it is worded in the link above, at this stage the video driver “looks up gamma-corrected values in the table and then sends the corrected values to the monitor instead of the actual surface values”.

Could you also confirm for this stage what the default settings are for Retroarch?

Wow, I didn’t know what I was missing the past few months, but that shader really makes my life easier, thanks for that @dogway. I really appreciate this shader, since all my displays are at Rec.709 / 100 nits / BT.1886 EOTF. Just set a few parameters and it outputs like it should! :upside_down_face:

3 Likes

Ok, found some time to design a new gamma pipeline given that games could have gamma embedded.

Please test and compare with the old version. What it does is basically honor the game’s embedded gamma throughout the whole process until it reaches your display. The display should be calibrated to a standard gamma though: sRGB, Rec.1886, DCI, etc.

There’s a new parameter called “signal gamma”; this is the gamma embedded in the game, or in other words, the CRT gamma of the developer’s display.

1 Like

If g_signal_type is set to composite mode, you get some pretty nasty color artifacts; if it’s set to RGB, it’s fine (new version to test)

1 Like

It’s working fine on my side. Might it be related to LUTs?

1 Like

It’s D3D related: on Vulkan/OpenGL it’s without artifacts, on DX10+ it breaks. But yes, then I can give it a try

Is there any game you can think of that would demonstrate embedded gamma?

Toggling between the old grade and the new one seems to make the picture warmer. On a preset calibrated with NTSC-U at 6500K it looks too warm, but with NTSC-J at 8000K it looks more color accurate. Also, on the previous version I needed a black level set to -0.04 (labeled g_life); with the new one I no longer need to do that.

I am not knowledgeable on the wide array of gamma info, so I’m commenting purely on its visual impact on the image. I think the new one is an overall improvement.

1 Like

The point here is that gamma-related board circuits were very expensive to implement in the consoles, so all games already had gamma embedded in the color values.

For this reason we assume the game’s embedded gamma is the correct one, and we should just cancel our calibrated display gamma to honor the game’s.

I didn’t test much but didn’t notice a shift in color. It could be interesting to test with the test suite.

2 Likes

The gamma ramps differently, doesn’t it?

If I use the SMPTE bars and set the brightness (black level) like in the old days, the 11.5 IRE bar is brighter than in the old version. Should it be the same, or did I set it up wrong? Also, the slider for black level seems twice as sensitive (old version -0.06 = new version -0.03?).