Dogway's grading shader (slang)

No, it only sets it to sRGB when you set srgb_framebuffer to “true” in the preset. That snippet says:

if *not* using GLES (i.e., if it's desktop OpenGL; GLES is weird about sRGB)
   then -> if (there's a framebuffer *and* that framebuffer's format is set to sRGB)
      use an sRGB framebuffer
      otherwise explicitly *don't* use an sRGB framebuffer
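
In C-ish terms that logic amounts to something like the sketch below (illustrative only; `fbo` and `format_is_srgb` are stand-in names, not RetroArch's actual identifiers):

    #ifndef HAVE_OPENGLES                 // GLES is weird about sRGB
    if (fbo && fbo->format_is_srgb)       // preset set srgb_framebuffer = "true"
       glEnable(GL_FRAMEBUFFER_SRGB);     // GL gamma-encodes shader output on write
    else
       glDisable(GL_FRAMEBUFFER_SRGB);    // default: write shader output as-is
    #endif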

Thanks :slight_smile:

Good to know that for shaders the default is to *not* set sRGB.

Upon further reading it seems it’s also possible that gamma is applied by default by the graphics API at the end of the pipeline (so unrelated to the shader setting). I can imagine this is also why it’s important to have it off by default in the shader, as otherwise there would be double gamma encoding.
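
A toy example of that double-encoding problem, assuming a plain 2.2 power curve (not shader code):

    #include <cmath>
    #include <cstdio>

    int main()
    {
        float mid   = 0.5f;
        float once  = std::pow(mid, 1.0f / 2.2f);   // ~0.73: correctly encoded mid-grey
        float twice = std::pow(once, 1.0f / 2.2f);  // ~0.87: encoded again, washed out
        std::printf("once: %.2f  twice: %.2f\n", once, twice);
    }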

For example, for D3D the link below suggests that values are gamma corrected by default when written out to the monitor. Reading further, this is something that can be set explicitly or not; the important part is that gamma can also be applied at this stage of the pipeline, unrelated to shader settings.

https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/using-gamma-correction

My assumption would be that RetroArch uses the default behaviour of the various video drivers (D3D, GL, Vulkan), which would be to encode the signal to sRGB space upon sending data to the monitor. Or, as it is worded in the link above, at this stage the video driver “looks up gamma-corrected values in the table and then sends the corrected values to the monitor instead of the actual surface values”.
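
For illustration, programming that table through DXGI looks roughly like the sketch below (my own sketch of the API the article describes, not anything RetroArch is confirmed to do; note SetGammaControl is only honored for full-screen swap chains):

    #include <dxgi.h>
    #include <cmath>

    void set_gamma_ramp(IDXGIOutput *output, float gamma /* e.g. 2.2f */)
    {
        DXGI_GAMMA_CONTROL ramp = {};
        ramp.Scale  = { 1.0f, 1.0f, 1.0f };
        ramp.Offset = { 0.0f, 0.0f, 0.0f };
        for (int i = 0; i < 1025; i++)
        {
            // each entry maps a surface value to its gamma-corrected output
            float v = std::pow(i / 1024.0f, 1.0f / gamma);
            ramp.GammaCurve[i] = { v, v, v };
        }
        output->SetGammaControl(&ramp);
    }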

Could you also confirm what RetroArch’s default settings are for this stage?

Wow, didn’t know what I was missing the past few months, but that shader really makes my life easier, thanks for that @dogway. I really appreciate this shader, since all my displays are at Rec.709 / 100 nits / BT.1886 EOTF. Just set a few parameters and it outputs like it should! :upside_down_face:


Ok, found some time to design a new gamma pipeline given that games could have gamma embedded.

Please test and compare with the old version. What it does is basically honor the game’s embedded gamma through the whole process until it reaches your display. The display should be calibrated to a standard gamma though: sRGB, Rec.1886, DCI, etc.

There’s a new parameter called “signal gamma”, this is the embedded gamma in the game, or in other words, the CRT gamma of the developer’s display.
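
Per channel, my understanding of the idea in sketch form (illustrative names, and `do_grading` is a placeholder for the grading steps; not the shader's actual code):

    #include <cmath>

    static float do_grading(float v) { return v; }  // placeholder for the grading work

    float grade_channel(float src,
                        float signal_gamma,   // developer's CRT gamma, e.g. 2.40
                        float display_gamma)  // your calibrated gamma, e.g. 2.40
    {
        float lin = std::pow(src, signal_gamma);     // undo the game's embedded gamma
        lin = do_grading(lin);                       // process in linear light
        return std::pow(lin, 1.0f / display_gamma);  // the display's EOTF cancels this
    }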


If g_signal_type is set to composite you get some pretty nasty color artifacts; if it’s set to RGB it’s fine (testing the new version).


It’s working fine on my side. Might it be related to LUTs?


It’s D3D related: on Vulkan/OpenGL there are no artifacts, on DX10+ it breaks. But yes, I can give that a try.

Is there any game you can think of that would demonstrate embedded gamma?

Toggling between the old grade and the new one, the new version gives a warmer picture. On a preset calibrated with NTSC-U at 6500K it looks too warm, but with NTSC-J at 8000K it looks more color accurate. Also, on the previous version I needed the black level set to -0.04 (labeled g_lift); with the new one I no longer need to do that.

I am not knowledgeable on the wide array of gamma info, so I’m commenting purely on its visual impact on the image. I think the new one is an overall improvement.


The point here is that gamma-related board circuitry was very expensive to implement in the consoles, so all games already had gamma embedded in their color values.

For this reason we assume the game’s embedded gamma is the correct one, and we should just cancel our calibrated display gamma to honor the game’s.
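
A quick sanity check of that cancellation, assuming a plain power-law display (a toy check, not shader code): with equal gammas and no grading in between, decode followed by encode is an identity, so the embedded gamma reaches the display untouched.

    #include <cassert>
    #include <cmath>

    int main()
    {
        float g = 2.4f;  // signal gamma == calibrated display gamma
        for (float v = 0.05f; v < 1.0f; v += 0.05f)
            assert(std::fabs(std::pow(std::pow(v, g), 1.0f / g) - v) < 1e-5f);
        return 0;
    }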

I didn’t test much but didn’t notice a shift in color. It could be interesting to test with the test suite.


The gamma ramps differently, doesn’t it?

If I use the SMPTE bars and set the brightness (black level) like in the old days, the 11.5 IRE bar is brighter than in the old version. Should it be the same, or did I set it up wrong? Also the slider for black level seems twice as sensitive (old version -0.06 = new version -0.03?).

Things are prone to change since our interpretation of linear has changed now. The above is a test version; I’m mostly interested in differences with everything else turned off. CRT gamut, color space, saturation, temperature, etc. are fine because they don’t mess with the tonal range.

I suggest keeping signal gamma and CRT gamma the same, as if we had calibrated our emulated CRT gamma to the developer’s CRT display.


I tried the new version and only see a very tiny difference when running the test suite grey scale patterns, and I don’t think I would notice the difference in a double-blind test. I do like the new version better conceptually.

I’m now trying to get the best possible calibration with DisplayCAL, but am a bit confused by all the settings. It would be great if you could advise on the below. If I want the best possible result with guest-dr-venom with grade included, which calibration settings and which 3D LUT settings would I use:

For grade should I use “-1: rec 709” or “0: sRGB”?

With regards to DisplayCAL settings:

On tab Calibration:

  1. which tone curve? (see picture 1)
  2. for tone curve custom - what gamma value and should it be relative or absolute (see picture 2)

On tab 3D LUT:

  1. Which tone curve (see picture 3)
  2. Which rendering intent (see picture 4)
  3. Which source colorspace (see picture 5)

You have to match grade’s color space with DisplayCAL’s TRC setting. It depends on your viewing environment: if you play in daytime use sRGB, otherwise in a dim to dark environment use Rec.709 in grade and Custom in DisplayCAL (2.4 Relative when not OLED).

For the 3D LUT settings, use Rec.709 and the same TRC settings. You need to enable “Advanced options” from the top menu and uncheck “Apply calibration (vcgt)”. You can test different rendering intents; some people prefer Preserve Saturation even though it can change hues a bit.


Sadly the link is expired :pleading_face:


Sorry, it was only a test version to get some feedback, didn’t expect it would take more than a week. Here it is again: https://pastebin.com/8n8nkkpw


I shuffled things a bit. I moved the gamut and temperature block up to the analogue region, so this is processed over Rec.601 primaries as it should be, and later converted to Rec.709 or your SPC space. This now happens in SMPTE-C linear gamma, not the sRGB transfer function.
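
For reference, that primaries conversion boils down to a 3×3 matrix applied in linear light; a sketch with the commonly published SMPTE-C to Rec.709 coefficients (approximate values, not necessarily the exact matrix grade uses):

    static const float SMPTEC_TO_REC709[3][3] = {
        { 0.9395f,  0.0502f, 0.0103f },
        { 0.0178f,  0.9658f, 0.0164f },
        {-0.0016f, -0.0044f, 1.0060f },
    };

    // multiply a linear SMPTE-C RGB triplet into linear Rec.709 RGB
    void smptec_to_rec709(const float in[3], float out[3])
    {
        for (int r = 0; r < 3; r++)
            out[r] = SMPTEC_TO_REC709[r][0] * in[0]
                   + SMPTEC_TO_REC709[r][1] * in[1]
                   + SMPTEC_TO_REC709[r][2] * in[2];
    }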

I also calculated the temperature of each phosphor gamut matrix. Before, I was using standard temperatures, but now each has a matching temperature to normalize them to D65, so we can later add a temperature of our liking. I might change them to a Bradford matrix later on if it helps with hue deviations.
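
For context, the Bradford approach means adapting whites through a cone-response space rather than scaling XYZ directly. The matrix below is the standard Bradford matrix, and the construction in the comment is the conventional one (not grade's confirmed implementation):

    // a source-white -> D65 adaptation is conventionally built as
    //   M = inverse(BRADFORD) * diag(cone_D65 / cone_src) * BRADFORD
    static const float BRADFORD[3][3] = {
        { 0.8951f,  0.2664f, -0.1614f },
        {-0.7502f,  1.7135f,  0.0367f },
        { 0.0389f, -0.0685f,  1.0296f },
    };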

https://pastebin.com/gECYMRjv


Witchcraft I say, witchcraft.

Nice update


I noticed g_crtgamut (Phosphor) at “0” or “-1” produces a black screen.


Strange, it’s not happening to me, but it might possibly be a bug on my side. Are you on Vulkan or gl_core?

Happens to me when I replace the first “stock” pass in the latest guest-dr-venom with grade.slang. Only setting the CRT phosphor to “0” or “-1” produces the black screen; the other phosphor settings work.

Latest guest-dr-venom here: https://forums.libretro.com/t/new-crt-shader-from-guest/25444/278

guest.r has explicitly added a first “stock” pass to the shader to provide room to replace it with any external pass (a category which I assume grade.slang also fits into).

EDIT: ah, it happens only with GLcore set, not with Vulkan.