Dogway's grading shader (slang)

Thanks for the insight, good to know grade already incorporates the adjustments to the gamma curve with the linear segment towards black.

It’s quite interesting to see how important gamma is for correct color reproduction. I was thinking of theoretically reproducing a vintage CRT’s gamma of 2.5 with the current gamma pipeline, but then I realized I don’t quite understand the complete pipeline. Hopefully you can shed some light on this. For illustration I’m using an RGB intensity value below to see what goes in and what comes out.

Emulator gamma pipeline

  1. BSNES core outputs raw RGB pixel value (from range 0-255) : R(ed)=128
  2. Dogway’s Grade encodes value with CRT gamma 2.5: R(ed) = ((128 / 255) ^ (1 / 2.5)) * 255 = 194
  3. guest-dr-venom gamma_input DECODES it with gamma 2.4: R(ed) = ((194 / 255) ^ (2.4)) * 255 = 132
  4. guest-dr-venom gamma_out encodes it with gamma 2.4: R(ed) = ((132 / 255) ^ (1 / 2.4)) * 255 = 194
  5. Retroarch graphics API outputs value for videocard. Retroarch gfx api encodes it with gamma 2.2 : R(ed) = ((194 / 255) ^ (1 / 2.2)) * 255 = 225
  6. My PC attached IPS monitor decodes value with gamma 2.33 (DisplayCAL measured): R(ed) = ((225 / 255) ^ (2.33)) * 255 = 190

So for my pipeline with Grade gamma at 2.5, an RGB intensity value of R(ed)=128 goes in and an intensity value of R(ed)=190 is what reaches my eyes.
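
If it helps to double-check the arithmetic, here is a minimal sketch of the chain above in C, using the same pure power-law simplification as the numbers in the list (the helper names are just placeholders):

#include <math.h>
#include <stdio.h>

/* Pure power-law helpers on 0-255 values (a simplification; grade actually
   uses piecewise "sRGB-type" curves, as discussed further down). */
static double encode_gamma(double v, double g) { return pow(v / 255.0, 1.0 / g) * 255.0; }
static double decode_gamma(double v, double g) { return pow(v / 255.0, g) * 255.0; }

int main(void)
{
    double r = 128.0;          /* 1. bsnes core output                */
    r = encode_gamma(r, 2.5);  /* 2. Grade, CRT gamma 2.5   -> ~194   */
    r = decode_gamma(r, 2.4);  /* 3. gamma_input 2.4        -> ~132   */
    r = encode_gamma(r, 2.4);  /* 4. gamma_out 2.4          -> ~194   */
    r = encode_gamma(r, 2.2);  /* 5. gfx API, gamma 2.2     -> ~225   */
    r = decode_gamma(r, 2.33); /* 6. monitor, measured 2.33 -> ~190   */
    printf("value reaching the eyes: %.0f\n", r);
    return 0;
}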

Real SNES hardware gamma pipeline

So what happens with a real SNES connected to a vintage CRT? Does the SNES do gamma encoding or not? Both cases are below:

SNES does gamma encoding?

  1. SNES encodes R(ed)=128 with gamma 2.5: R(ed) = ((128 / 255) ^ (1 / 2.5)) * 255 = 194
  2. Vintage CRT decodes it with gamma 2.5: R(ed) = ((194 / 255) ^ (2.5)) * 255 = 128

SNES outputs raw RGB intensity values (NO gamma encoding)

  1. SNES outputs R(ed)=128
  2. Vintage CRT decodes it with gamma 2.5: R(ed) = ((128 / 255) ^ (2.5)) * 255 = 46
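
The same simplified helpers make the two cases easy to compare side by side (again just a sketch of the arithmetic, not a claim about what the actual hardware does):

#include <math.h>
#include <stdio.h>

static double encode_gamma(double v, double g) { return pow(v / 255.0, 1.0 / g) * 255.0; }
static double decode_gamma(double v, double g) { return pow(v / 255.0, g) * 255.0; }

int main(void)
{
    /* Case 1: SNES gamma-encodes, CRT decodes -> round trip back to ~128 */
    printf("with SNES encoding:    %.0f\n", decode_gamma(encode_gamma(128.0, 2.5), 2.5));

    /* Case 2: SNES outputs raw values, CRT still decodes with 2.5 -> ~46 */
    printf("without SNES encoding: %.0f\n", decode_gamma(128.0, 2.5));
    return 0;
}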

Mismatch emulator output and real SNES:

So for a real SNES connected to a vintage CRT (gamma = 2.5), when the SNES outputs an intensity value of 128 for a red pixel, what gets displayed on the vintage CRT is an intensity value of either 128 or 46, depending on whether the SNES gamma-encodes its output signal or not.

In the emulator gamma pipeline, when the RA bsnes core outputs an intensity value of 128 for a red pixel and the Grade shader CRT gamma is set to 2.5, an intensity value of 190 is displayed on the LED monitor. That is a large discrepancy versus real SNES hardware connected to a vintage CRT monitor!

@Dogway Since the discrepancy between input and output for both pipelines is so large, I’m sure the real pipelines differ from what is described above.

This is just intended as a starting point for getting to the real answer for both pipelines.

My question is: what do the real pipelines look like, for both the “Emulator gamma pipeline” and the “Real SNES hardware gamma pipeline”? Could you take the pipelines depicted above and rearrange, add, or remove steps where necessary, so that we get a true picture of both?

1 Like

In your examples you are using power-law gamma operations, but as I said, grade uses “sRGB”-type gamma functions with a linear segment towards black. I do several gamma operations because no developer could confirm to me why the emulators are not outputting RAW values and instead rely on a gamma-encoded output. For this reason I assume the output is already sRGB gamma encoded, so the first step in grade is to linearize it with the inverse operation. Once linearized, I apply an SMPTE gamma function with a gamma_in value, in your case 2.5. This is now our CRT ground truth. We re-linearize it with the inverse sRGB function, do all our grade operations, and output with the gamma of the selected color space; if that is sRGB the gammas cancel and we get a match for the CRT ground truth.
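
For readers following along, these are the textbook forms of the two curve families mentioned here, in C. This is a reference sketch only, not grade’s actual source; grade parameterizes the SMPTE exponent through gamma_in, while the canonical curve below uses the fixed 0.45 exponent:

#include <math.h>
#include <stdio.h>

/* sRGB decode (encoded -> linear); linear segment below 0.04045 */
static double srgb_to_linear(double v)
{
    return (v <= 0.04045) ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4);
}

/* sRGB encode (linear -> encoded); linear segment below 0.0031308 */
static double linear_to_srgb(double l)
{
    return (l <= 0.0031308) ? 12.92 * l : 1.055 * pow(l, 1.0 / 2.4) - 0.055;
}

/* SMPTE 170M / Rec.601-style encode (linear -> encoded); linear segment below 0.018 */
static double smpte170m_encode(double l)
{
    return (l < 0.018) ? 4.5 * l : 1.099 * pow(l, 0.45) - 0.099;
}

int main(void)
{
    /* One value through the chain described above: assume the core output is
       sRGB-encoded, linearize it, encode it with the SMPTE curve ("CRT ground
       truth"), then re-linearize and re-encode with sRGB, which cancels out. */
    double core = 128.0 / 255.0;
    double lin  = srgb_to_linear(core);
    double crt  = smpte170m_encode(lin);
    double out  = linear_to_srgb(srgb_to_linear(crt));  /* == crt */
    printf("core %.3f -> linear %.3f -> CRT ground truth %.3f -> output %.3f\n",
           core, lin, crt, out);
    return 0;
}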

If you want the gamma output from RetroArch to match your calibrated display, you have to color-manage RetroArch. For that, use DisplayCAL with the Reshade LUT output; you can load that LUT in grade. (Or calibrate your display to 2.2 gamma.)

The SNES has to gamma-encode for the composite signal; it uses the SMPTE gamma function.

2 Likes

Thanks for the explanation, I think I understand the process now:

  1. The bsnes core output is assumed to be encoded with sRGB gamma (2.2)
  2. The grade shader decodes the output from step 1 with the inverse operation: we now have RAW values
  3. The RAW values are encoded by grade with the SMPTE gamma function (2.5 in my case)
  4. grade linearizes again with the inverse sRGB operation, does its grade operations, and encodes to the target space, which if sRGB cancels out the earlier gamma operation (the values from step 3 are maintained)
  5. The LED monitor decodes with sRGB gamma 2.2 (calibrated)

So in summary, if we strip out all canceling encoding/decoding steps:

  1. effectively, RAW values are encoded by grade with the SMPTE gamma function and a value of 2.5 (in my case; a setting between 2.35 and 2.55 is recommended in the grade shader guide)
  2. LED monitor decodes with sRGB gamma 2.2 (calibrated)

Compare this to real SNES hardware connected to a vintage CRT, which will most likely look like:

  1. RAW values are encoded by SNES hardware with gamma of 2.35 - 2.55 (is the real value known?).
  2. Vintage CRT decodes with gamma of 2.35 - 2.55 (the exact value will differ per TV set)

Step 1 is comparable between emulation and real hardware.

But I see an issue with step 2: a vintage CRT decodes with a gamma between 2.35 and 2.55, while in the emulated setup the LED monitor decodes at 2.2 (calibrated).

So what am I overlooking in this?

1 Like

Yes, gamma is a complex topic; have a look at this paper to get to know all the typical different gammas.

The SNES has to gamma encode the signal before modulation. I just read that gamma correction hardware is/was very expensive, so my guess is that the games are already gamma corrected. That would change things a bit, but in any case I opted to keep things open.

On paper the signal is encoded with a 2.222 SMPTE gamma to compensate for the CRT’s calibrated 2.2 gamma. In reality CRTs were more like 2.5 gamma, and the signal, at least for SNES games, was authored with 2.5 gamma (?).

I chose to keep the encoding-decoding within a single function (gamma_in) so it is more user friendly. In the end it is very similar to your first bullet point for the SNES hardware, probably more towards 2.35 gamma, but who knows.

On your last point: it’s not an issue at all. If you take a greyscale ramp with gamma 2.2 and raise the gamma, the view-referred greyscale gamma is no longer canceled, regardless of monitor gamma; you are left with the offset, which is what we care about.
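
A rough numerical sketch of that offset, using plain power laws for simplicity (grade’s actual curves are piecewise, so this is illustrative only): a ramp re-encoded for a 2.5 target but decoded by a 2.2 display keeps a residual exponent of 2.2/2.5 ≈ 0.88 instead of canceling back to the original values.

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Illustrative only: pure power laws, no linear segment. */
    for (double v = 0.1; v <= 0.91; v += 0.2) {
        double encoded   = pow(v, 1.0 / 2.5); /* re-encoded for a 2.5 CRT */
        double displayed = pow(encoded, 2.2); /* decoded by a 2.2 display */
        printf("in %.2f -> out %.3f (net exponent %.2f)\n", v, displayed, 2.2 / 2.5);
    }
    return 0;
}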

I might try the above things (gamma assumed) but I’m a bit busy lately.

1 Like

I had a similar thought :slight_smile:. In the Poynton paper mentioned earlier, there’s this interesting quote:

“A CRT’s response is very nearly the inverse of the lightness sensitivity of vision: <…>The fact that a CRT’s transfer function is very nearly the inverse of the lightness sensitivity of vision is an amazing, and fortunate, coincidence!”

In addition to gamma-correction hardware being expensive, this would make it logical that it simply wasn’t necessary to incorporate any gamma-encoding hardware in the early days of computing. Games could just be authored on a CRT, as human vision would invert the non-linear response of the CRT, leaving a pretty much linear response to intensity values. An amazing coincidence indeed.

You already have this on your mind, but for my understanding: would that “simplify” the emulation gamma pipeline to:

  1. make sure the emulator outputs RAW values (undo any sRGB gamma encoding if necessary)
  2. Gamma encode the values considering a gamma correction for the difference between user’s monitor gamma (usually 2.2, or DisplayCAL measured) and target CRT gamma (2.5)
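
To make step 2 concrete, here is one possible reading of it with plain power laws (the variable names are just placeholders, not grade parameters): raising the raw values to crt_gamma / monitor_gamma means the monitor’s own decode ends up reproducing the CRT’s response.

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double crt_gamma     = 2.5;   /* target vintage CRT        */
    const double monitor_gamma = 2.2;   /* user's calibrated display */
    double raw = 128.0 / 255.0;         /* raw (linear) core value   */

    double corrected = pow(raw, crt_gamma / monitor_gamma); /* value sent to the display */
    double displayed = pow(corrected, monitor_gamma);       /* monitor's own decode      */
    double crt_ref   = pow(raw, crt_gamma);                 /* vintage CRT response      */

    printf("displayed %.4f vs CRT reference %.4f\n", displayed, crt_ref);
    return 0;
}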

Could we possibly end up with two grade gamma settings: one for the user’s PC monitor gamma (as some people may want to use their DisplayCAL-measured value instead of the default setting) and one for the target CRT gamma?

1 Like

I was thinking of one other thing: RetroArch’s gfx drivers (GL / Vulkan / D3D11, etc.) may be setting sRGB space by default. If I search the RetroArch GitHub for “sRGB” I get loads of hits. Things like:

#if !defined(HAVE_OPENGLES)
   if (framebuffer && framebuffer->get_format() == GL_SRGB8_ALPHA8)
      glEnable(GL_FRAMEBUFFER_SRGB);
   else
      glDisable(GL_FRAMEBUFFER_SRGB);
#endif

I wouldn’t be surprised if sRGB space were set by default for these drivers. That could mean your earlier comment, “so the first step in grade is to linearize that with the inverse operation”, would still be valid?

Maybe @hunterk could confirm whether Retroarch gfx APIs set the framebuffer to sRGB colorspace by default?

It would be very helpful to have this confirmed to ensure that we achieve correct gamma behaviour in grade.

No, it only sets it to sRGB when you set srgb_framebuffer to “true” in the preset. That snippet says:

if *not* using GLES (i.e., if it's desktop OpenGL; GLES is weird about sRGB)
   then -> if (there's a framebuffer *and* that framebuffer's format is set to sRGB)
      use an sRGB framebuffer
      otherwise explicitly *don't* use an sRGB framebuffer
2 Likes

Thanks :slight_smile:

Good to know that for shaders the default is to -not- set sRGB.

Upon further reading, it seems it’s also possible that gamma is applied by default by the graphics API at the end of the pipeline (so unrelated to any shader setting). I can imagine this is also the reason why it’s important to have it off by default in the shader, as otherwise there would be double gamma encoding.

For example, for D3D the link below suggests that values written out to the monitor are gamma corrected by default. Reading further, this is something that can be set explicitly or not; the important part is that gamma can also be applied at this stage of the pipeline, unrelated to shader settings.

https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/using-gamma-correction

My assumption would be that RetroArch uses the default behaviour of the various video drivers (D3D, GL, Vulkan), which would be to encode the signal to sRGB space when sending data to the monitor. Or, as it is worded in the link above, at this stage the video driver “looks up gamma-corrected values in the table and then sends the corrected values to the monitor instead of the actual surface values”.
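
As a purely conceptual sketch of what such a gamma table amounts to (not actual driver or DXGI code): the table is just a per-channel lookup applied to every surface value on its way to the monitor.

#include <math.h>
#include <stdio.h>

int main(void)
{
    unsigned char ramp[256];

    /* Build a 2.2 encoding ramp as the hypothetical gamma table. */
    for (int i = 0; i < 256; i++)
        ramp[i] = (unsigned char)(pow(i / 255.0, 1.0 / 2.2) * 255.0 + 0.5);

    unsigned char surface_value   = 128;                  /* what the shader wrote       */
    unsigned char sent_to_monitor = ramp[surface_value];  /* what scan-out sends (~194)  */

    printf("surface %d -> monitor %d\n", surface_value, sent_to_monitor);
    return 0;
}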

Could you also confirm what RetroArch’s default settings are for this stage?

Wow, I didn’t know what I was missing the past few months, but that shader really does make my life easier, thanks for that @dogway. I really appreciate this shader, since all my displays are at Rec.709 / 100 nits / BT.1886 EOTF. Just set a few parameters and it outputs like it should! :upside_down_face:

3 Likes

OK, I found some time to design a new gamma pipeline, given that games could have gamma embedded.

Please test and compare with the old version. What it does is basically honor the game’s embedded gamma throughout the whole process until it reaches your display. The display should be calibrated to a standard gamma though: sRGB, Rec.1886, DCI, etc.

There’s a new parameter called “signal gamma”; this is the gamma embedded in the game, or in other words the CRT gamma of the developer’s display.

1 Like

If g_signal_type is set to composite mode you get some pretty nasty color artifacts; if it’s set to RGB it’s fine (new version to test).

1 Like

It’s working fine on my side. Might it be related to LUTs?

1 Like

It’s D3D related: on Vulkan/OpenGL there are no artifacts, on DX10+ it breaks. But yes, I can give that a try.

Is there any game you can think of that would demonstrate embedded gamma?

Toggling between the old grade and the new one, the new version gives a warmer picture. On a preset calibrated with NTSC-U at 6500K it looks too warm, but with NTSC-J at 8000K it looks more color accurate. Also, on the previous version I needed the black level (labeled g_lift) set to -0.04; with the new one I no longer need to do that.

I am not knowledgeable about the wide array of gamma info, so I’m commenting purely on its visual impact on the image. I think the new one is an overall improvement.

1 Like

The point here is that gamma-related board circuits were very expensive to implement in the consoles, so all games already had gamma embedded in the color values.

For this reason we assume the game’s embedded gamma is the correct one, and we should just cancel our calibrated display gamma to honor the game’s.

I didn’t test much but didn’t notice a shift in color. It could be interesting to test with the test suite.

2 Likes

The gamma ramps differently, doesn’t it?

If I use the SMPTE bars and set the brightness (black level) like in the old days, the 11.5 IRE bar is brighter than on the old version. Should it be the same, or did I set it up wrong? Also, the slider for black level seems twice as sensitive (old version -0.06 = new version -0.03?).

Things are prone to change since our interpretation of linear has changed now. The above is a test version; I’m mostly interested in differences with everything else turned off. CRT gamut, color space, saturation, temperature, etc. are fine because they don’t mess with the tonal range.

I suggest keeping signal gamma and CRT gamma the same, as if we had calibrated our emulated CRT’s gamma to the developer’s CRT display.

2 Likes

I tried the new version and only see a very tiny difference when running the test suite grey-scale patterns; I don’t think I would notice the difference in a double-blind test. I do like the new version better conceptually.

I’m now trying to get the best possible calibration with DisplayCAL, but I’m a bit confused by all the settings. It would be great if you could advise on the below. If I want the best possible result with guest-dr-venom with grade included, which calibration settings and which 3D LUT settings should I use?

For grade should I use “-1: rec 709” or “0: sRGB”?

With regard to the DisplayCAL settings:

On the Calibration tab:

  1. which tone curve? (see picture 1)
  2. for a custom tone curve: which gamma value, and should it be relative or absolute? (see picture 2)

On the 3D LUT tab:

  1. Which tone curve (see picture 3)
  2. Which rendering intent (see picture 4)
  3. Which source colorspace (see picture 5)

You have to match grade’s color space with the DisplayCAL TRC setting. It depends on your viewing environment. If you play in daytime, use sRGB; otherwise, in a dim to dark environment, use Rec.709 in grade and Custom in DisplayCAL (2.4 Relative when not OLED).

For the LUT settings, use Rec.709 and the same TRC settings. You need to enable “Advanced options” from the top menu and uncheck “Apply calibration (vcgt)”. You can test different rendering intents; some people prefer Preserve Saturation even though it can change hues a bit.

1 Like

Sadly the link is expired :pleading_face:

1 Like