Sony Megatron Colour Video Monitor

Yes, we know that; that's why we have a 10bit HDR swap chain and 10bit/16bit intermediate buffers. BUT RetroArch has NEVER offered anything other than an 8bit swap chain in SDR. When you put float_framebuffer in the preset or use #pragma format in the last pass, RetroArch ignores it and you write to an 8bit buffer.

What are you basing this on? You've only been able to see a difference when you've switched on HDR and used a 10bit format in RetroArch.

Not in SDR they can't - I bet the Nvidia/Intel/AMD drivers don't even allow you to have the r709 colour space with a 10bit swap chain. It may be in the spec, but the reality is usually very different - I'd have to write a test to prove it, but the fact that no game has ever supported this points to it.

Here's the math, even with Windows 11 Auto Colour Management on (an internal scRGB DWM compositor), for pixel values 100 and 99:

(100 / 255) * 1023 = 401.18 (rounded to 401 in 10bit - the fractional part is used to trigger temporal dither)

(99 / 255) * 1023 = 397.16 (rounded to 397 in 10bit - the fractional part is used to trigger temporal dither)

Same relative gap, and the DWM/GPU output hardware's temporal dithering (based on the fractional part) will cancel out, so the dither doesn't help either. The only time this will help is when you have Windows compositing stuff, i.e. Steam overlays etc., BUT even then, in the vast majority of cases it won't affect the image in the slightest. In 99.9% of cases you're just wasting electricity.
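The arithmetic above can be sanity-checked in a few lines (plain Python; nothing here is RetroArch code, just the 8bit-to-10bit scaling argument):

```python
# Scale an 8-bit code value into a 10-bit range. Adjacent 8-bit codes
# stay the same relative distance apart, so no new tonal detail appears.
def to_10bit(v8):
    return (v8 / 255) * 1023

for v8 in (100, 99):
    v10 = to_10bit(v8)
    print(v8, round(v10, 2), round(v10))  # input, exact value, 10-bit code

# The gap between the two 10-bit codes is 4, i.e. the same 1/255 step.
gap = round(to_10bit(100)) - round(to_10bit(99))
print("10-bit gap:", gap)  # prints 4
```

In other words, an 8bit source re-expressed in a 10bit container just multiplies every step by 4; the extra codes between 397 and 401 are never used.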

Yes, it is the last pass, BUT it's not 10bit, it's 8bit - it completely ignores you asking for a 10bit buffer in the D3D11, D3D12 and Vulkan RetroArch drivers, and always has done. The same goes for float_framebuffer in the preset and a 16bit float format request in the shader with #pragma format.

1 Like

No, it doesn't work like that and never has - see my last reply. You need HDR to benefit from a 10bit swap chain, and a 16bit swap chain has never been supported in RetroArch.

1 Like

I guess I need to reopen my feature request for RetroArch to support a 10-bit swapchain for WCG use case. I assume the inserted final shader doesn’t do any dithering, right?

Of course it does. How else could apps like Photoshop and calibration software work? Windows just doesn’t set the color space for you. In the past you had to control the monitor modes yourself (or use software designed to work with the monitor). The video driver dictates the final output, not the OS. In the past it was only supported by professional cards, but it has been available universally since Turing.

Supposedly Alien: Isolation supports 10-bit SDR.

ETA: Even with an RTX card, there are different cases as to whether or not 10-bit output is supported. The proposed solution should be for RA to query valid swap-chain formats and choose the best one given the requested last pass output format, instead of just choosing 8-bit.
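The negotiation being proposed could look something like this sketch. Everything here is hypothetical (`pick_swapchain_format`, the format names, the preference table); real code would get the supported list from DXGI or Vulkan rather than a Python set:

```python
# Hypothetical swap-chain format negotiation: given the last pass's
# requested format and the formats the driver actually exposes, pick
# the closest match instead of hard-coding 8-bit.
PREFERENCE = {
    "rgba16f": ["rgba16f", "rgb10a2", "rgba8"],
    "rgb10a2": ["rgb10a2", "rgba16f", "rgba8"],
    "rgba8":   ["rgba8", "rgb10a2", "rgba16f"],
}

def pick_swapchain_format(requested, supported):
    for candidate in PREFERENCE[requested]:
        if candidate in supported:
            return candidate
    return "rgba8"  # 8-bit is the guaranteed fallback everywhere

# Driver that exposes a 10-bit SDR swap chain:
print(pick_swapchain_format("rgb10a2", {"rgba8", "rgb10a2"}))  # rgb10a2
# Driver that only exposes 8-bit in SDR:
print(pick_swapchain_format("rgb10a2", {"rgba8"}))             # rgba8
```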

1 Like

That isn’t how SDR works. SDR output is generally color space agnostic (aside from a handful of somewhat hacky metadata-based extensions to the Blu-ray spec, like Sony’s “Mastered in 4k” line of WCG 1080p Blus, or 4Ks with WCG SDR).

In Windows, SDR is just RGB range/color-channel-saturation, and bit-depth/granularity, with no inherent recognition of color gamut, which the end user is expected to manage appropriately using their display’s settings and, if necessary, the Color Management system.

That is why we can display the SDR output of emulators/cores as whatever arbitrary color gamut we wish.

There is generally a correct color gamut to display a given SDR signal in, obviously, but that isn’t actually enforced by or even properly documented within the signal. One must use context clues to determine the appropriate gamut.

Sometimes the true correct gamut is unknowable, and we can only make a best guess. Especially for older material or material that wasn’t produced using calibrated displays, the true correct color gamut is whatever the screen they happened to be made on was outputting, and we will never know what exactly that really was.

As @anikom15 mentioned, Alien: Isolation supports 10-bit SDR output (they call the option Deep Color if i remember right).

But even modern HDR games generally don’t properly leverage WCG support. SDR output with presumed (but not enforced, because again, SDR doesn’t really work that way) Rec709 primaries remains the default, so there hasn’t really been much effort put into actually taking advantage of the possibilities offered by WCG support.

2 Likes

This is true, and it's largely because games need to be legible in a variety of conditions more than films do, and you can't just grade two separate masters like a movie. You can release a game with HDR support and DCI-P3, but then what will sRGB SDR players experience? You do not want to deliver crushed colors, and developers simply cannot afford the time and expense to build a completely separate tonemapping and gamut compression system, then debug it by going through the whole game making sure you can see clearly inside dark areas and so on, just to make sure the SDR sRGB version is completely legible. Hence even today we have AAA games coming out in SDR (still looking great, mind you).

It doesn't help that we now have people wanting to play on handhelds and VR, which are not great viewing conditions. That means limited contrast ratios and clarity, and less room for error if outputting multiple color spaces, especially for 3D, where lighting models have a huge range of nit values.

2 Likes

I think it’s also that theatrical releases have been standardized around using a WCG container (DCI-P3) for the last 20 years, just as the gaming industry has been nominally targeting various flavors of more or less Rec709/sRGB for the last 20 years. So it was a much easier sell to convince people in the film industry to switch to a Rec2020 container (especially with the stick that was Netflix demanding HDR deliverables, and the carrot that was Dolby offering a much better systemic solution for generating additional derivative grades.)

The flip-side is that people in the film industry have been largely disinterested in the possibilities of the additional brightness offered by HDR, since they are focused primarily on 50ish and 100ish nit theatrical presentations. While game developers have been much more willing to experiment with that aspect.

1 Like

@MajorPainTheCactus

Looks like Colour Boost “Wide” is targeting the wrong primaries again in the latest nightly. (Assuming that setting is still meant to be P3-D65 primaries anyway.)

“Expanded” primaries have also changed significantly.

Same applies for the Wide and Expanded settings in the current versions of hdr-config.slangp and hdr.slangp.

When used in vulkan or d3d12, scRGB mode seems to fail to tonemap content while the menu or a widget is visible. This appears not to be an issue in d3d11.

And for the record: at some point, i still hope to see improvements in Megatron’s color rendition and accuracy that bring it back up to the level of 2.6.2 with the “Colour Accurate” setting.

1 Like

It is how SDR works, I'm afraid: the swap chain ALWAYS has a colour space. Sure, you may not need to do anything other than gamma correct, because we are implicitly in r709, but your swap chain ALWAYS has a colour space in SDR. It'll be either DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 in DXGI or VK_COLOR_SPACE_SRGB_NONLINEAR_KHR in Vulkan.

"As @anikom15 mentioned, Alien: Isolation supports 10-bit SDR output (they call the option Deep Color if i remember right)."

No, we don't actually know that - although the AI has told you about this Deep Color option, we have no idea whether it's actually talking about a 10bit SDR swap chain. BUT it's beside the point: to all intents and purposes, no one uses a 10bit SDR swap chain - even in Alien Isolation's potential case, the default is an 8bit SDR swap chain. The AI has named a single release from 12 years ago, and that's it out of the tens (hundreds?) of thousands of game releases. We've got one potential game where you have to switch on a potential option in the game and override your drivers, plus back then none of the monitors would probably have supported it (and I guess none of the graphics cards/drivers did either). 99.9999% of use cases outside of professional graphics do not use a 10bit SDR swap chain (it's to that precision because it's an option no one probably used in Alien Isolation even if it did do what we're saying it does).

I've made a huge amount of changes, so it's no wonder certain things have broken (I only have a certain amount of time) - I'll take a look at Wide.

1 Like

“And for the record: at some point, i still hope to see improvements in Megatron’s color rendition and accuracy that bring it back up to the level of 2.6.2 with the “Colour Accurate” setting.”

'Colour Accurate' just meant the r2020 colour space conversion was done after the scanline simulation - this destroyed the mask being correct. We have the same values now - I've tested it with RenderDoc - BUT we have correct masks. What could well be going on is that because each CRT 'pixel' is now spread over four LCD/OLED pixels, Lilium's HDR shader is incorrectly telling you the colours are off. I mean, it's right from a per-pixel perspective - they are off - but when you spread the colour over 4 pixels, i.e. from 1 foot away (just like a CRT), they are correct.

It’d be good to get an external colorimeter to check.

If you don't think that's what you are seeing, then do let me know - there can always be a bug that has crept back in.

1 Like

scRGB HDR Support

Ok, so I decided to add scRGB support to RetroArch over the weekend. scRGB is another version of HDR, based around 16bit float buffers, and is what Windows 11 uses in the DWM when Auto Colour Management is on, i.e. for its compositor. It's seen as the gold standard for HDR on Windows, but I'm not so sure that's true - probably more marketing hype than reality, as the hardware still quantises the 16bit buffer down to a 10bit signal, and so you may well get better results intelligently doing it yourself rather than letting the hardware apply a dumb quantisation.
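To illustrate the quantisation point, here is a minimal sketch (plain Python; random noise stands in for the ordered/temporal dither a shader or the display hardware would really use) of "dumb" rounding versus dithered rounding of a high-precision buffer down to 10 bits:

```python
import random

# A smooth ramp held in a high-precision intermediate, standing in for
# a 16bit float scRGB buffer.
ramp = [i / 4095 for i in range(4096)]

# "Dumb" hardware-style quantisation: straight rounding to 1024 levels.
naive = [round(v * 1023) / 1023 for v in ramp]

# Doing it yourself: add sub-LSB noise before rounding, so the *local
# average* of the output tracks the original signal instead of stepping.
rng = random.Random(0)
dithered = [round(v * 1023 + rng.uniform(-0.5, 0.5)) / 1023 for v in ramp]

print(len(set(naive)))  # only 1024 distinct levels survive naive rounding
```

The naive path bands the gradient into 1024 flat steps; the dithered path trades that banding for noise whose average still matches the source, which is the kind of "intelligent" quantisation a shader can do before the hardware gets involved.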

So we now support 3 swap chain formats selectable from the HDR menu:

| HDR Mode | Swapchain Format |
| --- | --- |
| Off | r709 sRGB 8-bit UNORM (B8G8R8A8) |
| HDR10 | HDR10 PQ 10-bit UNORM (A2B10G10R10) |
| scRGB | scRGB Linear 16-bit float (R16G16B16A16F) |

We also still have the three final pass formats: 8bit unorm, 10bit unorm and 16bit float, for a total of 9 combinations! Plus we have three drivers, so actually there are 27 combinations (and that's not taking into account scale type, preset keywords etc.), but hopefully you can ignore that while appreciating the complexity involved here.

So we have three rules:

  1. If a shader outputs a bit depth (last shader pass format) that is lower than the swap chain's, then RetroArch will step in and convert for you with a hidden pass.
  2. If a shader outputs a bit depth that is equal to the swap chain's, then it will leave it up to you, as you will be writing directly to the swap chain.
  3. If a shader outputs a bit depth that is higher than the swap chain's, then it will ignore your request, use the swap chain's format, and then follow rule 2.

So why?

  • Rule 1 allows us to upgrade legacy shaders to HDR
  • Rule 2 allows us to directly write to the swapchain instead of incurring an extra hidden conversion and copy - important for low end/portable/power constrained devices, accuracy and shader author control
  • Rule 3 allows us to have a single shader that supports all three swap chains

In order to use rule 3 you need to declare the final pass format as RGBA16 float and then use the 'HDRMode' shader constant to do the necessary conversions, OR use hdr\shaders\hdr.slang by adding it as the final pass of your shader.
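The three rules can be sketched as a small decision helper (hypothetical names and format strings, not RetroArch's actual code):

```python
# Sketch of the three rules: resolve what the last pass really renders
# to, and whether RetroArch inserts a hidden conversion pass.
DEPTH = {"rgba8": 8, "rgb10a2": 10, "rgba16f": 16}

def resolve_last_pass(shader_format, swapchain_format):
    s, c = DEPTH[shader_format], DEPTH[swapchain_format]
    if s < c:
        # Rule 1: shader keeps its format; RA adds a hidden conversion pass.
        return shader_format, True
    if s == c:
        # Rule 2: shader writes the swap chain directly, no extra pass.
        return swapchain_format, False
    # Rule 3: the higher-depth request is ignored; fall back to the swap
    # chain's format, after which rule 2 applies.
    return swapchain_format, False

print(resolve_last_pass("rgba8", "rgb10a2"))   # ('rgba8', True): hidden pass
print(resolve_last_pass("rgba16f", "rgba8"))   # ('rgba8', False): ignored
```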

Here is the combination table (swapchain formats across the top, shader output formats down the side) that hopefully tells you what you need to do in what scenario:

| Shader Output \ Swapchain | r709 sRGB 8-bit UNORM (B8G8R8A8) | HDR10 PQ 10-bit UNORM (A2B10G10R10) | scRGB Linear 16-bit float (R16G16B16A16F) |
| --- | --- | --- | --- |
| 8-bit UNORM (SDR sRGB) | Passthrough | RA converts: sRGB decode → InverseTonemap → 709→2020 → PQ encode | RA converts: sRGB decode → scale by MaxNits/80 |
| 10-bit UNORM (HDR10 PQ) | Ignored — uses 8-bit, passthrough. Shader must output SDR sRGB | Passthrough — shader must output PQ Rec.2020 | RA converts: PQ decode → 2020→709 → ×125 |
| 16-bit float | Ignored — uses 8-bit, passthrough. Shader must output SDR sRGB | Ignored — uses 10-bit, passthrough. Shader must output PQ Rec.2020 | Passthrough — shader must output linear Rec.709 scRGB (1.0 = 80 nits) |
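For reference, the conversions named in the table mostly reduce to standard transfer-function math. A scalar sketch using the usual sRGB and SMPTE ST 2084 constants (per channel; the 709→2020 matrices are omitted for brevity, and `max_nits=400` is just an example peak value, not a RetroArch default):

```python
def srgb_decode(c):
    # sRGB EOTF: gamma-encoded [0,1] -> linear [0,1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_encode(nits):
    # SMPTE ST 2084 inverse EOTF: absolute luminance -> PQ signal [0,1]
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_to_scrgb(c, max_nits=400):
    # scRGB is linear with 1.0 = 80 nits, hence the MaxNits/80 scale
    return srgb_decode(c) * (max_nits / 80)

def hdr10_to_scrgb_scale(nits):
    # PQ-decoded values are relative to 10000 nits: 10000/80 = x125
    return nits / 10000 * 125

print(round(pq_encode(100), 3))  # ~0.508: 100 nits sits mid-range in PQ
```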
3 Likes

The swap chain has a colorspace, yes, but we don’t actually care about that. We don’t need to abide by the swapchain format because the GPU will just put out whatever data we send it. It’s just a convention. We can use whatever three-channel colorspace we want as long as the monitor can be set to accept it. You can even output non-RGB, like YCbCr. The user is responsible for selecting the right format.

What does matter is the swapchain format, which is the primary point of this discussion: there is #pragma format, which implies to the shader developer that this is the format of the shader pass, and when that format doesn't match the swapchain format, another pass has to be added at the end. This is not intuitive and is not mentioned anywhere in the documentation. So I am trying to update the SLANG format document so that it's easier to use. Many things are missing; there's a lot to cover. That's my first priority: just understanding how the shader system works as-is and making that more accessible for people.

Years ago I asked about 10-bit output support for WCG in RA as a feature request. The RA developers did not even know; they basically said 'we support HDR now, so 10-bit in SDR probably works', except that's not true. So it seems RA developers aren't even aware of this behavior.

2 Likes

I prefer to say: each emulated CRT phosphor triad.

Thanks for finally articulating this clearly, because that's exactly what is going on, and it's a miracle that it happens to align properly for slot masks, etc.

And this very precise accident is why BBGGRRX or XBBGGRR doesn't align properly with RWBG/WOLED displays.

Since this is the dawn of RGWB tandem OLED and new subpixel layouts, can you provide a simple blueprint or tips for adding new layouts or updating existing subpixel layouts in the near future?

It's unfortunate that no one with an LG G5 has come forward to assist in testing and ensuring that they are getting properly aligned subpixel mask emulation, but I guess ignorance is bliss, and no one really cares much about or respects the value of getting the CRT emulation right from the building blocks, except us few.

For what it's worth, I downloaded a new nightly a couple of days ago and converted all of the presets in my latest Epic preset pack, which is based on Sony Megatron Colour Video Monitor v1, and from the 1 or 2 presets I tested, they looked pretty much identical to how they looked before the conversion.

I might still need to reinstate my custom modifications for some of the 8K masks to get additional 4K masks at the different heights that I liked, but due to the changes in the mask code that you introduced, I think I would first see how they look with the new mask code before deciding if they need any further modification.

The reason I did that in the first place was because I preferred the stubbier 8K Masks over the more elongated 4K ones but then the 8K ones were no longer RGB/RBG/BGR.

1 Like

Of course it matters: if our native colour space was HDR10, we wouldn't need to do anything for HDR10 output, and we'd instead have to do something to convert it to sRGB/r709. Games have to do this through tonemapping etc. Look, let's stop arguing about whether the sky is blue - it is.

1 Like

I don’t get what you mean by this. In all three cases, HDR, WCG, SDR, I have to do a conversion. I don’t get anything for free because the source data is in a different space (let’s just call it linear RGB).

I understand that when we have a final shader pass in 10-bit and an 8-bit swap-chain, we have to convert to 8-bit somehow. That's totally acceptable. But we are all saying that a 10-bit SDR swap-chain is available (albeit not to everyone), and therefore the 10-bit to 8-bit conversion in SDR should not need to happen when we could make use of that swap-chain. Further, we've seen how this invisible pass insertion has created bugs and user unfriendliness, so from a maintenance standpoint it makes sense for the program to try to match the swap chain with the desired format if at all possible.

Look, let's start again: we select our swap chain via the RetroArch menu. Custom shaders ask for a desired render target format via the preset or #pragma format. The two requests may mismatch, and we have to decide how to cope with that. In all cases you will need to output the swap chain's format.

You seem to think (I could be wrong) that adding support for 10bit SDR would help with this: it won't - you will still need to support HDR swap chains in your shader. In fact this will make things more difficult, and no one will use it, as we have HDR.

Probably poor wording on my part. Adding 10-bit SDR support (let's just call it WCG, even if that's not perfectly correct) is a feature request. With the current detection to trigger inverse tonemapping when HDR is on, a WCG shader needs to use a conditional to output either a gamma-corrected signal (which ends up being 8-bit anyway, because RA won't use a 10-bit swap chain if HDR is not enabled) or an HDR10 signal. I think that's acceptable, and it's a 'problem' that is totally independent of whether 10-bit output happens or not. I actually just want to have WCG for its own sake. I think the confusion stems from the earlier discussion about the 10-bit/16-bit format triggering the HDR10 assumption unexpectedly.

The colors look wrong to visual inspection, not the analysis tools. The analysis tools just help determine what has changed and what might be going wrong.

What should be pure primaries (red, green, or blue) now fully activate multiple colors of subpixel, and all colors look substantially different from the shaderless output (with everything set to Rec709/Accurate mode).

Actual CRTs didn't (intentionally) activate multiple colors of phosphor on pure primaries, as is demonstrated by this post from the CRT photos thread (Calling all CRT owners: photos please!).

RetroGames4K’s image from Rockman X3

Megatron 2.6.2 (Colour Accurate mode)

Megatron 2.7.0

Actual CRT gamuts were literally determined by the physical color of their phosphors. I suspect that the mismatch between our target color gamut primaries and our simulated phosphor primaries is at least part of the issue here.

For the record: i absolutely do not ever use AI tools intentionally, as i have yet to be convinced they are in any way fit for purpose. Insofar as i have been forced to interact with them against my will, they have done the absolute opposite of impress me, especially Google’s AI Overview, which has been wrong in some way or another literally every. single. time. it has come up whilst i was looking something up.

I remember Alien: Isolation having 10-bit SDR support because it was discussed regularly when the game came out, nearly a year before HDR10 was even announced.

(I recognize that many software developers have been impressed by AI assisted coding, and i acknowledge the possibility that it might have a real niche for that purpose specifically, but even there i remain extremely skeptical given the studies suggesting that people regularly think that AI saved them time when it actually took them longer.)

Getting things ready for Sony Megatron Colour Video Monitor v2!

This is a tribute to the shader devs who made this all possible. Know that your hard work is not in vain and is greatly appreciated!

CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC Composite Sharp PVM Edition Epic CAR9x7x or CAR7x6x W4.slangp

CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC Composite Shadow Mask Epic CAR9x8x or CAR7x6x W3.slangp

CyberLab MegatronV2

CyberLab Megatron V2 Presets incoming…

For those who say they don't know which preset to choose, or that there are so many, this is why I say: read the thread.

CyberLab Megatron miniLED 4K HDR Game BFI SNES Composite PVM Edition Epic W5.slangp

CyberLab Megatron miniLED 4K HDR Game BFI SNES Composite Sharp PVM Edition Epic W4.slangp

Introducing CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC S-Video CyberTron Epic CAR9x8x or CAR7x6x W4.slangp

I hope more and more folks are viewing these on their properly calibrated HDR setups now. You should definitely be viewing them at 1:1 scale or zoomed in. SDR users, you need to brighten your display in order to view them properly.

Bonus Content:

2 Likes

Are you updating your Online Shaders as well after updating to the new RetroArch Nightly Builds when testing? Can you also provide the presets when showing examples so that others may test and confer?

1 Like