Sony Megatron Colour Video Monitor

Would it be possible to set up a specific trigger to activate HDR instead of relying on the framebuffer format?

It happened to me to trigger it with the float framebuffer (16bpc?) keyword in the slangp, and it was unwanted, as I just needed more color precision when accessing the last pass feedback.

2 Likes

@Azurfel I omitted the scale_type line and, after restoring the proper PQ function, the shader is now in a state I can work with, thanks! Has this issue been reported on GitHub?

2 Likes

Nope. I’m in agreement with @kokoko3k that cutting to the chase and adding an explicit trigger function of some sort would be the better way to go about this.

Something as simple as shaders always being treated as SDR unless something like “#HDR_ACTIVATE” is placed on the line after the #pragma, and always being treated as HDR if that line is present, seems like a far better, simpler solution than trying to refine the current detection methodology.

2 Likes

…or just a metadata tag in the slangp, rather than the .slang shader file itself ?

hdr_output = “1” or similar

btw, nice trick to use scale_type to disable it, I could use it as an unintended workaround :slight_smile:

1 Like

Ooo, yeah, i like this idea better.

Ok so what scale_type does is create a separate render target buffer under the hood, distinct from the swap chain buffer (and it adds an extra hidden copy pass), and so yes, the internal HDR shader will kick in (instead of the plain copy pass it does an HDR conversion and copy pass). We can probably change that by recognising when scale_type has been added together with #pragma format on the last pass, in which case the normal extra copy pass should just be used, as that pass is going to do the HDR10 colour conversion itself.

As for @kokoko3k, I’m not sure I understand your issue: switching HDR on in the menu changes the swap chain’s colour space. If you have a shader and it outputs SDR values, it will always be interpreted incorrectly no matter the swap chain format you use, and you need to convert to the HDR10 colour space. The correct solution is to detect that HDR is on and convert to the HDR10 colour space yourself - which you can now do via the ‘EnableHDR’ shader parameter. If you turn off HDR in the menu then you’re fine with your RGBA16 float buffer, as nothing will need to be converted to the HDR10 colour space and the internal HDR shader will be disabled. Is something not working as it should, maybe?

As for the more explicit shader support: firstly you need to add that to the slang compiler, which is why it wasn’t done in the first place - it’s a bit weird to be adding swap chain colour space support to a shader or preset pass, though, and it will probably cause bugs and user confusion, as it will conflict with the main menu setting.

If you don’t want the internal shader to convert to HDR for you then you’d need to add an extra pass that converts SDR to the HDR colour space, and defining #pragma format allows us to say to RetroArch “don’t do anything, we’ll take care of it”. The final pass of any shader chain writes directly to the swap chain buffer (which is intrinsically tied to the swap chain colour space) unless you use scale_type, BUT something still needs to convert the colour space, as otherwise it won’t look correct.
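For reference, taking over the conversion yourself means the final .slang pass declares its own output format via the pragma, e.g. (format names follow the Vulkan-style identifiers used by the slang shader spec; the 10-bit format discussed here would be):

```
#pragma format A2B10G10R10_UNORM_PACK32
```

With that declared, the pass itself is expected to write HDR10 (rec2020 + PQ) values.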

So the question I have is: when do you want to use RGB10A2 or RGBA16 float in HDR and have the internal HDR shader kick in? Is it that you don’t want to convert to the HDR10 colour space yourself? If so then just add the pass in slang-shaders\hdr\hdr.slangp and it will do the same thing, OR add the scale_type as described at the top here.

1 Like

I think modifications to the shader compiler should be undertaken with care. Since resources are so limited, it’s best to focus on the bugs and establish baseline functionality. I have made some notes for updating the SLANG format docs, including a section on HDR shader programming and how RA is supposed to handle SDR data.

From the shader developer’s perspective we have three classes of shaders that do fundamentally different things. To ensure our shader is displayed correctly when HDR is enabled:

SDR: set ACCURATE color space in HDR setting, set 8-bit texture format or assume it will be the default, inverse tone mapper will be applied

HDR: specify 10-bit format, encode color data as PQ, ensure scale_type is not set for the final pass <- this part isn’t intuitive

WCG: set color space in HDR settings to SUPER, append hdr_v2.slang, ensure scale_type is not set <- this compounds the unintuitive aspect even further; the shader is not HDR, why do we need to append a shader labeled hdr to get it to work correctly?
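To make the “encode color data as PQ” step concrete, here is a minimal sketch of the ST 2084 PQ encode in Python - the generic transfer function with its standard constants, not RetroArch’s actual shader code:

```python
# SMPTE ST 2084 (PQ) encode: absolute luminance in nits -> 0..1 signal.
# M1/M2/C1/C2/C3 are the standard constants from the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalise to the 10,000-nit PQ range
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

# 10,000 nits encodes to exactly 1.0; SDR reference white (100 nits)
# lands near the well-known ~0.508 code value.
```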

First priority: fix the scale_type bug. This is affecting both the HDR and WCG shader use cases now.

Second priority: either WCG shaders will have to be updated to include hdr_v2.slang or the shader compiler will need to be updated to support something like hdr_output = 1 or #define PQ_ENCODED or some other way where the shader tells RetroArch ‘this is PQ-encoded output, don’t apply the inverse tone-mapper’. To be clear, this isn’t an override of any of the settings in RA itself, it’s simply aiding RA in identifying what functions it needs to apply for HDR rendering.

Third priority: decoupling of tone mapping parameters from the various HDR functions. The current settings don’t make sense from a user standpoint either: if I set the peak luminance to actually match my display, the inverse tonemapper will make SDR content always reach that level. Let me ask this question: if you have a 10,000 nit display, do you actually want SDR content to be able to display 10,000 nits? The description should be changed from ‘set this value according to rtings measurements’ to ‘set this value to the maximum brightness SDR content should be allowed to display’. The UI should have its own tone map settings, as most people will want the UI dimmer than the content overall.
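To put a number on that: any inverse tone mapper whose top end is anchored at the configured peak will, by construction, drive full-white SDR pixels to that peak. A toy illustration in Python (a made-up piecewise curve, not RetroArch’s actual inverse tonemapper; the knee value and parameter names are assumptions for the sketch):

```python
def inverse_tonemap(l_sdr: float, peak_nits: float, paper_white_nits: float) -> float:
    """Toy inverse tone mapper (illustration only, NOT RetroArch's code).
    l_sdr is SDR luminance in 0..1: below the knee it is scaled by paper
    white; above it, the remaining range is stretched so that
    l_sdr == 1.0 lands exactly on peak_nits."""
    knee = 0.7
    if l_sdr <= knee:
        return l_sdr * paper_white_nits
    t = (l_sdr - knee) / (1.0 - knee)
    return knee * paper_white_nits + t * (peak_nits - knee * paper_white_nits)

# With peak set to match a 10,000-nit display and paper white at 200 nits,
# full-white SDR content is driven all the way up to 10,000 nits - which is
# exactly the behaviour being questioned here.
```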

With this perspective, the toHDR shaders getting values from the RA interface rather than from the parameter menus is probably not a good idea and not intuitive. If someone is using a shader then they would expect to have full control over that shader through the shader’s parameters. Instead it would probably be better to state that the HDR inverse tone-mapper settings apply only to SDR content and don’t affect HDR shaders. To aid users, an ‘HDR’ indicator can be added to the UI showing that the content/shader is detected as HDR.

I would also recommend a switch in the interface to turn off the inverse tone-mapper entirely and just set brightness with paper white, so one doesn’t need to set peak luminance to paper white to bypass it.

You’re overcomplicating things a bit here. The only thing you need to think about is your final pass’s format:

  • RGBA8 (default): ignore all HDR settings; RetroArch will take care of HDR for you.
  • RGB10A2 and RGBA16: you take complete control of all HDR calculations when HDR is enabled*

What the main menu says with regards to colour boost is neither here nor there, as you can completely ignore it - it’s purposely high level so that it doesn’t conflict with shader parameters that may or may not do the same thing.

The menu settings are really only for the internal shader - I’ve just exposed them to shaders should you want to use them. You are free to interpret them in whatever way you want, or you can absolutely ignore them. The shader parameters are your shader’s main config menu.

*Right, so there is a bug here with RGBA16 float, and it’s probably what is affecting @kokoko3k: we specifically say turn on the HDR10 conversion but turn off the inverse tonemapper. I think this should all be turned off, just like RGB10A2 - as in, the shader writer takes complete control.

So we have two bugs:

  1. When scale_type is used and #pragma format is set, the internal HDR shader shouldn’t be used - currently it is.
  2. RGBA16 should behave just like RGB10A2 - the shader takes complete control of HDR instead of the current halfway house.

As for the menu label help text - I think this is a practicality thing for the vast majority of users, as no TV currently goes anywhere near 10,000 nits: all OLEDs have ABL limiters and LCDs have lower full-screen peak luminance values. The current text is fairly straightforwardly actionable by Joe Bloggs on the street - you set peak once and then mess around with paper white. The ideal thing here would be an HDR calibration screen for users, but time and resources prevent that.

2 Likes

With the proposed bug fixes, the workflow will look like this:

SDR: identical, no changes needed

WCG: WCG shader developers need to add hdr_v2.slang as the last pass to ensure correct output in HDR mode (or handle the conditional themselves)

HDR: no changes needed

I think this is a reasonable compromise. There are probably not that many WCG shaders to update. Then we can update the SLANG docs to state that if a 10-bit+ format is specified, the shader is expected to output HDR10.

1 Like

There isn’t really such a thing as a WCG shader from RetroArch’s perspective: it either has a 709 or a 2020 colour space swap chain. WCG shaders are entirely up to you to support or not, just like any other shader feature. If you do want to support that colour space, or any other colour space, just change your last pass format to rgb10a2 and take control - roll your own, copy the relevant functions, or include the headers, whichever you prefer.

1 Like

I think the point, from a shader developer’s perspective, is that the developer shouldn’t need knowledge of how HDR works in order to develop a WCG shader. The way RA works now, a WCG shader has to output the correct PQ-encoded format to display correctly in HDR mode. That would also apply to SDR shaders, but it doesn’t, because RA provides the HDR10 conversion for SDR-detected shaders as a convenience. Extending that convenience to WCG shaders is what we’re hoping for long-term.

Appending hdr_v2.slang to the end of the chain ourselves is a reasonable compromise. We don’t need to know the implementation details of HDR, we just need to know that it has to be at the end of the chain. The wording in the SLANG documentation could be something like ‘If you use a format other than 8-bit, you will need to convert your output to HDR10 using the EnableHDR uniform. For convenience, you can use the hdr_v2.slang pass to do this for you.’

1 Like

I kind of disagree that shader developers shouldn’t need to know how HDR works to develop WCG shaders: the reason being that RetroArch and Windows don’t directly support DCI-P3 or Adobe RGB or ACES etc. They only support the larger HDR10 container space, rec2020, and so you have to do the work to convert to that space. By the time you’ve got your PQ, you are basically only talking about an extra matrix multiply for the colour space itself - it’s not a biggie. I’m a firm believer that it’s best left up to you what you want to do, and RetroArch, Windows etc. just support the basics.

The whole internal HDR shader thing exists because HDR wasn’t a thing for years while RetroArch was about, and so a huge library of shaders didn’t support HDR; something needed to be done to support them out of the box, and this seemed the best thing to do at the time, rightly or wrongly.

1 Like

https://www.displayspecifications.com/en/model/64b3462a

https://www.avsforum.com/threads/tcl-98-inch-x11l-sqd-mini-led-review-–-not-rgb-not-oled-something-else.3341308/

https://www.youtube.com/watch?v=ekBsy-G4czg

1 Like

Early in the morning here, and I admit I didn’t fully digest all the replies, sorry :stuck_out_tongue:

I don’t use HDR. In the past I tried to use the float_framebuffer directive for the last pass just to sample with more precision from its feedback, and a user running Windows reported to me that the colors were all washed out.

Since sampling from the last pass’s feedback with whatever arbitrary precision is a legitimate use case, I think the HDR triggering should be handled at a higher level via the slangp/parser, so with a new specific keyword rather than tied to the framebuffer color format, that’s it.

2 Likes

@MajorPainTheCactus As i understand @kokoko3k’s suggestion: would it be possible to add an option along the lines of “hdr_mode(x) = true/false” for slangp presets? Similar to things like “srgb_framebuffer(x) = true/false” or “filter_linear(x) = true/false”?

Such that, if that line is absent or set to “hdr_mode(x) = false” on the final preset of the chain, RetroArch treats it as an SDR shader, regardless of scale_type or pragma format or whatever else?

And if it has “hdr_mode(x) = true” for the final shader of the chain, RetroArch treats it as an HDR shader, again, regardless of scale_type or pragma format or whatever else?

1 Like

It could be hdr_mode = auto by default, which would retain the current behaviour, and on/off to force it.
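A sketch of what that could look like in a .slangp (note: hdr_mode is purely hypothetical here - it is the key being proposed, not something RetroArch currently parses):

```
shaders = "1"
shader0 = "my_last_pass.slang"

# Proposed, NOT an existing key: auto = current detection behaviour,
# on/off = force RetroArch to treat the preset's output as HDR/SDR.
hdr_mode = "auto"
```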

1 Like

Yes sadly this is a case of ‘if only life were so simple’.

Your final buffer of your final pass is special: in the general case it is directly the swap chain’s buffer. The swap chain’s buffer is special because it tells the gfx driver what format your data is in so that it can pass it to the monitor (which strictly takes a 10bit signal in HDR and an 8bit signal in SDR); this is done through a combination of both the format and the colour space. As such, you need to do the appropriate work to move your buffer into the correct colour space.

Fortunately for RetroArch shader developers who have only ever dealt with SDR, when HDR is turned on I added a hidden extra pass that does this SDR to HDR work for you: largely the inverse tonemap, HDR10 colour conversion and PQ ST2084 conversion, amongst a few other bits and bobs. But when you use 16bit float buffers, life changes for you, as that directly impacts how the gfx driver is going to interpret the data.
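Those three main steps of the hidden pass (inverse tonemap, HDR10/rec2020 colour conversion, PQ ST2084 encode) can be sketched end to end. This is an illustrative Python rendition of the standard maths, not the internal shader’s actual code - the paper-white scale stands in for the real inverse tonemapper, and the matrix is the standard Rec.709-to-Rec.2020 primary conversion:

```python
# Standard Rec.709 -> Rec.2020 primary conversion matrix (rows sum to 1,
# so neutral greys stay neutral).
REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def pq(nits: float) -> float:
    # SMPTE ST 2084 encode: absolute nits -> 0..1 signal.
    y = min(max(nits / 10000.0, 0.0), 1.0)
    yp = y ** (2610 / 16384)
    return ((3424 / 4096 + (2413 / 4096 * 32) * yp)
            / (1.0 + (2392 / 4096 * 32) * yp)) ** (2523 / 4096 * 128)

def sdr_to_hdr10(rgb_709_linear, paper_white_nits=200.0):
    """Sketch of the hidden SDR -> HDR pass: scale linear Rec.709 values by
    paper white (a simplified stand-in for the inverse tonemap), rotate into
    Rec.2020 primaries, then PQ-encode each channel."""
    nits = [c * paper_white_nits for c in rgb_709_linear]
    rgb_2020 = [sum(m * c for m, c in zip(row, nits)) for row in REC709_TO_REC2020]
    return [pq(c) for c in rgb_2020]
```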

In the case of your 16bit buffer, the driver will clamp values above 1.0 to 1.0, dither that extra precision down to 10bit, and also use the colour space to interpret those values differently. So you must do some work to get your colours into the correct space, namely all the things I said above - RetroArch won’t help you here*, as you’re off piste so to speak, and it’s up to you how to map things down. Otherwise we’d have to start adding loads of shader code and options to RetroArch, and invariably shader developers wouldn’t be happy.

Now I did say RetroArch won’t help you here, but actually it currently does - and I’m about to change that, as it’s a halfway house which is probably the worst of both worlds: currently, if HDR is on and you choose the float16 buffer format, it will do the HDR10 and PQ conversion for you but NOT the inverse tonemap.

I think there are a lot of reasons for you to take care of the HDR10 and PQ conversion as well, and so I’m just about to stop it doing that. The other issue is that this extra pass that gets added (both the original SDR version, when scale_type is used, and the HDR version) ignores your output format and has its own output format - effectively you lose the precision you wanted passed to the gfx driver. So we actually have yet another bug when scale_type is defined, in both SDR and HDR.

So you have four options to support HDR when outputting a 16bit buffer:

  1. roll your own functions
  2. copy the hdr functions into your code from hdr10.h / gamma_correct.h etc.
  3. include those headers directly
  4. Add this preset pass to your preset:

shader0 = “hdr/shaders/hdr_v2.slang”

or for completeness:

shader0 = “hdr/shaders/hdr_v2.slang”
filter_linear0 = “false”
wrap_mode0 = “clamp_to_border”
mipmap_input0 = “false”
alias0 = “”
float_framebuffer0 = “false”
srgb_framebuffer0 = “false”

There is of course a 5th option: ignore HDR and leave your shaders broken in HDR land.

1 Like

You have that already:

shader0 = “hdr/shaders/hdr_v2.slang”

(replace the 0 with the index of your last pass +1)

This currently works and requires no additional logic: no unique keywords, no additional parsing, etc.

1 Like

I think that was the bit I was missing. Sorry for asking before fully trying to understand your surely meaningful post, for which I thank you, but does what you wrote mean that if I set the last pass to 16bpc, the underlying issue is that the OS/driver would display wrong colors, as it thinks I want to display HDR content? Just yes/no would be ok, I’ll take that info to better understand the rest :wink:

2 Likes

No, it’s not just that you set the last pass to 16bpc; it’s that the last pass usually writes directly to the swap chain, which is special.

2 Likes