List of HDR-aware shaders?

AFAIK it’s just Megatron and guest-advanced, currently? Or are there others?

What do you mean by HDR-aware? AFAIK there’s no way for a shader to know whether the display is capable of HDR.

It has to work within the Rec 2020 color space. For example, guest-advanced has color palettes that cannot be properly displayed without HDR / Rec 2020.

A better way to phrase it might be: which shaders take advantage of HDR Rec 2020?

IIRC color correction shaders like gba-color and psp-color also have options for Rec 2020.

I use HDR with any and every shader I use in RetroArch. Also, the “HDR” component in Sony Megatron is actually modular and can be chained with other shaders.

From the RetroArch\shaders\shaders_slang\hdr\shaders\hdr10.slang file:

Part of the crt-sony-megatron shader group. Does the exact same thing as RetroArch does internally to map into HDR10 space.

This is used to do this mapping BEFORE screen effects are applied.

Originally part of the crt\crt-sony-pvm-4k-hdr.slangp but can be used for any shader

From the RetroArch\shaders\shaders_slang\hdr\shaders\inverse_tonemap.slang file:

Part of the crt-sony-megatron shader group. Does the exact same thing as RetroArch does internally to inverse tonemap from a SDR image to HDR.

This is used to do this mapping BEFORE screen effects are applied.

Originally part of the crt\crt-sony-pvm-4k-hdr.slangp but can be used for any shader

So my take is that these have been available for use with any shader for years.

Besides that, there are several other ways to “inject” HDR into RetroArch’s SDR content in order to unlock the full brightness of HDR displays.

Thanks, I just kind of assumed there was some necessary shader component to take advantage of the wider color space, but I guess everything is handled within RetroArch.

Scanline Classic isn’t HDR-aware, but it does support BT.2020. You just need to set your video card to output 10-bit color and set the color mode on your display to BT.2020. I believe HDR mode will do this for you automatically.

Version 6 of Scanline Classic has an HDRR. However, it has a tone-mapper and a clamper, so making it HDR-compatible would require revising the tone-mapper and replacing the gamma correction and clamp with the correct tone curve RA expects. (Do you know what that is? Is it HLG?)

I think it should be easy to do; I just need to know how to do it, and also test it. Does HDR work with Windows 10, or do you need 11?

HDR in RetroArch works in both Windows 10 and 11. Support was added by @MajorPainTheCactus, who added global support to RetroArch around the same time he released his Sony Megatron Colour Video Monitor shader.

The Sony Megatron Colour Video Monitor Shader is designed to be modular and various aspects of the HDR pipeline/process can be used with other shaders. You can open the shaders in the HDR folder and read what he wrote for other shader developers to build upon.

I’m not a dev so I don’t fully understand what he did or intended but maybe someone like yourself might find it useful.

@Azurfel also has an updated version of the Sony Megatron Colour Video Monitor shader where he corrected and improved on some things. You can try that one as well.

Windows 10 doesn’t include important system-level HDR calibration options, but there are third-party alternatives (CRU and set_maxtml). It also doesn’t support Microsoft’s Auto HDR, which I don’t think is worth using anyway. Other than that, Win10’s HDR is fine.

Depending on your settings, Retroarch’s HDR color primaries are clamped to either Rec. 709/sRGB or “Expanded 709”, so you cannot use the P3 or 2020 modes in various non-HDR shaders. This also applies to the master branch of Megatron.

To the best of my knowledge, my Megatron fork is currently the only Retroarch shader that works correctly with WCG in HDR, confirmable using Lilium’s HDR Analysis ReShade shader.

At one point I also had a test version of RetroArch itself set up that replaced RetroArch’s “Expanded 709” HDR primaries with Rec. 2020. I should probably rebase that, see if I can add support for P3 as well, and put in a pull request.

ST.2084/PQ. Megatron references the Xbox-ATG-Samples, and as far as I know the Megatron HDR implementation is identical to the RetroArch one (unless something has changed since I last checked).
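
For reference, the ST.2084/PQ curve mentioned here is a fixed formula with published constants; a minimal sketch of the encode direction (linear nits to PQ signal), using the constants from the SMPTE ST 2084 spec:

```python
# ST.2084 / PQ inverse EOTF: absolute linear luminance (nits) -> PQ signal.
# Constants are the published SMPTE ST 2084 values.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance in nits to a [0, 1] PQ code value."""
    y = max(nits, 0.0) / 10000.0   # normalize to the 10,000-nit PQ ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# 100 nits (typical SDR reference white) lands at roughly half signal (~0.508).
half = pq_encode(100.0)
```

This is why HDR10 signals can address the full 0–10,000-nit range in only 10 bits: the curve spends most of its code values on the darker region where the eye is most sensitive.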

RetroArch and Megatron’s SDR-to-HDR conversion is done using inverse Reinhard tone mapping, which probably isn’t the correct method for direct display, though it seems to give decent results for CRT shaders?
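
At its core, inverse Reinhard is just the algebraic inverse of the Reinhard curve. The sketch below shows only the bare curve; actual implementations parameterize it further with paper-white and peak-luminance settings, which are omitted here:

```python
# Reinhard tone mapping compresses [0, inf) into [0, 1); its algebraic
# inverse expands SDR values back out. Bare curve only -- real
# implementations add paper-white / peak-luminance scaling.

def reinhard(hdr: float) -> float:
    """Compress a linear HDR value into [0, 1)."""
    return hdr / (1.0 + hdr)

def inverse_reinhard(sdr: float) -> float:
    """Expand an SDR value in [0, 1) back to linear HDR.

    Diverges as the input approaches 1.0, so clamp slightly below it.
    """
    x = min(max(sdr, 0.0), 0.9999)
    return x / (1.0 - x)

# Round trip: a linear value survives tone map + inverse tone map.
assert abs(inverse_reinhard(reinhard(5.0)) - 5.0) < 1e-9
```

The divergence near 1.0 is also why the expansion aggressively brightens near-white SDR pixels, which can look fine through a CRT mask-and-scanline pipeline but questionable for direct display.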

For direct display, a methodology closer to film-industry standards, similar to the one described in MovieLabs’ Best Practices for Mapping BT.709 Content to HDR10 for Consumer Distribution, would likely give much more correct results, but this is beyond my capabilities to implement at this time.

I don’t understand what you mean by not being able to use the P3 or 2020 modes. On my TV I can select a gamut manually. Then 1.0 for each channel gets mapped to the target gamut primary. Does something behave differently in HDR mode that would prevent me from doing that? Manually setting the color mode isn’t really a big deal. E.g. I have a color correction shader that applies a Rec. 601 -> XYZ -> Rec. 2020 correction and I get the right colors when I set my TV to BT.2020 mode.

HDR always uses a Rec. 2020 container, with different content displayed using different gamuts inside of that container.

So Windows tonemaps SDR applications to display as piecewise sRGB inside the Rec. 2020 container.
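
That “piecewise sRGB” curve is the standard IEC 61966-2-1 transfer function: a short linear toe near black joined to a 2.4-power segment. Decoding it to linear light looks like this:

```python
# Piecewise sRGB EOTF (IEC 61966-2-1): gamma-encoded signal -> linear light.
# The linear toe below 0.04045 avoids the infinite slope a pure power
# curve would have at black.

def srgb_to_linear(v: float) -> float:
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4
```

The piecewise curve matters here because it differs slightly from a pure 2.2 gamma in the shadows, which is visible when Windows maps SDR content into the HDR container.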

Currently Retroarch HDR is Rec. 709 or “Expanded 709” inside the Rec. 2020 container depending on your settings. If you set a shader to use P3 or 2020 using a parameter setting, the gamut will be squished down to that selected gamut, resulting in a very undersaturated/washed out appearance.

The master branch of Megatron is Rec. 709 in the Rec. 2020 container if you set “HDR: Content Color Gamut” to original, or “Expanded 709” in the Rec. 2020 container if you set it to Vivid.

My Megatron fork allows you to select what gamut is used inside that Rec. 2020 container, currently any of:

  • 0=Rec 709/sRGB (SDR HDTV/Windows gamut)
  • 1=Expanded 709
  • 2=NTSC 1953 (The OG color system that was only really used for like 5-8ish years back when basically no one owned a color TV anyway. If you are Brazilian or from a SECAM region, it may also match some old CRT TVs you’ve used with really weirdly intense greens? Hard to say. This sort of thing is kind of underdocumented.)
  • 3=RCA 1958 (?1961?) (A Millennial’s grandparents’ old TV with weird colors #1.)
  • 4=RCA 1964 (A Millennial’s grandparents’ old TV with weird colors #2.)
  • 5=SMPTE C/Rec 601-525 line/Conrac (Baseline standard gamut for Analog NTSC.)
  • 6=PAL/Rec 601-625 line (Baseline standard gamut for Analog PAL.)
  • 7=Dogway’s NTSC-J (Baseline standard gamut for Analog NTSC-J.)
  • 8=P22_80s (Dogway’s Grade gamut for 1980s-early 1990s TVs.)
  • 9=Apple RGB/Trinitron PC (Should approximate basically any Trinitron monitor from 1987 through the mid-to-late 1990s. By the early 00s they were SMPTE C instead, at least for high-end monitors like the FW900.)
  • 10=guest’s Philips PC (Gamut used by a number of extremely popular monitors that used Philips tubes, including Philips CM8533, Philips VS-0080, and Commodore 1084)
  • 11=P22_90s (Dogway’s Grade gamut for mid 1990s TVs with tinted phosphors.)
  • 12=RPTV_95s (Dogway’s Grade gamut for late 90s/early 00s rear projection TVs that game manuals said you shouldn’t play games on due to burn-in risk.)
  • 13=Display P3/P3-D65 (Common wide color gamut. Variant on the gamut used for film with shared primaries. Might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader?)
  • 14=Rec 2020 (HDR gamut. Again, might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader.)
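
Mechanically, placing a narrower gamut inside the Rec. 2020 container is just a 3×3 matrix multiply on linear RGB. A sketch using the standard BT.709 → BT.2020 matrix from ITU-R BT.2087 (the other source gamuts in the list above would each use a different matrix derived from their primaries):

```python
# BT.709 -> BT.2020 gamut conversion on linear RGB (rounded coefficients
# from ITU-R BT.2087). Each row sums to 1.0, so white maps to white.

BT709_TO_BT2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

def to_bt2020(rgb, matrix=BT709_TO_BT2020):
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in matrix)

# Pure 709 red lands inside 2020 with positive green/blue components.
# Skipping this step makes 709 content look oversaturated in the 2020
# container; applying it to already-2020 content looks washed out.
red_in_2020 = to_bt2020((1.0, 0.0, 0.0))
```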

This is my current SDR color pipeline:

Emulator output [0-1] -> shaders -> ‘CRT’ linear RGB input [0-inf], (1, 1, 1) equals luminance of target display -> tone map to [0, 1] -> convert to sRGB -> compress gamut to [0, 1] -> gamma correct

WCG is this:

Emulator output [0-1] -> shaders -> ‘CRT’ linear RGB input [0-inf] -> tone map to [0, 1] -> convert to BT.2020 -> gamma correct (no need for gamut compression)
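
Under some simplifying assumptions (a bare Reinhard curve standing in for the tone map, the standard BT.2087 709→2020 matrix for the gamut conversion, and a pure 2.4 power for the gamma step, none of which are claimed to match Scanline Classic’s actual code), the WCG pipeline above could be sketched as:

```python
# Illustrative WCG pipeline: linear 'CRT' RGB [0, inf) -> tone map ->
# convert to BT.2020 -> gamma correct. All three stages are stand-ins.

BT709_TO_BT2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

def wcg_pipeline(linear_rgb):
    # Tone map [0, inf) -> [0, 1) with a plain Reinhard curve.
    r, g, b = [c / (1.0 + c) for c in linear_rgb]
    # Convert to BT.2020; no gamut compression needed, since 2020
    # contains 709 and all outputs stay in [0, 1].
    rgb2020 = [row[0] * r + row[1] * g + row[2] * b
               for row in BT709_TO_BT2020]
    # Gamma correct with a simple 2.4 power (stand-in for the real curve).
    return tuple(max(c, 0.0) ** (1.0 / 2.4) for c in rgb2020)
```

The "no need for gamut compression" note in the pipeline above falls out of the math: a convex combination of values in [0, 1) stays in [0, 1), so nothing needs to be clipped after the matrix.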

It sounds like HDR would be this?

Emulator output [0-1] -> shaders -> ‘CRT’ linear RGB input [0-inf] -> convert to sRGB -> compress gamut (negative colors only) -> tone map values so that (1,1,1) from emulator equals 100 nits avg. after processing (scanlines, mask, etc.), lower values align with BT.1886 gamma curve (or 2.4 approx.), higher values -> PQ

That final tone map is supposed to restore brightness and correct for gamma so that it has the same response as a CRT (or whatever). If it can be done without harming the color correction, that works. But if not:

Emulator output [0-1] -> shaders -> ‘CRT’ linear RGB input [0-inf] -> tone map values so that (1,1,1) from emulator equals 100 nits avg. after processing (scanlines, mask, etc.), lower and higher values scale linearly -> convert to sRGB -> compress gamut (negative colors only) -> tone map to correct gamma response

This assumes we can just use sRGB because we don’t need to worry about color values greater than 1. I don’t think I know enough about how HDR works…