Sony Megatron Colour Video Monitor

Really excited to see HDR updates, but unfortunately this completely breaks HDR in RetroArch on my setup: Windows 10 22H2, a 5090 on driver 591.74. The Vulkan and DX11 drivers show the same super-crushed, high-gain look, and on DX12 it doesn't open at all. The HDR settings don't change anything.

The previous implementation worked fine, except for the lack of a separate "UI Luminance" setting, which made high-luminance setups - like you need for BFI or the CRT beam simulator - extremely annoying to use with the ultra-bright UI you'd get as a result.


Yes, sadly I haven't had the time to devote to the Sony Megatron - hopefully I'll be able to give it a bit more attention now.

Great stuff - yes, I might add those features depending on the direction people want to go, but then again this is a built-in scanline simulation specifically tailored to casual users who don't venture into shader land - I don't want to overload them with options.

What I'll probably do instead is update the Sony Megatron - I basically already have - and update it in the shader repo (I'll rename the current one to 'legacy' so people can keep all their old presets working as-is with just a repointing to the old shader). That should hopefully meet all the demands of a power user whilst keeping the HDR menu relatively straightforward for casual users (OK, the subpixel layouts and luminance levels aren't the best).

I have no idea what Gamescope is - is that related to the Steam Deck/Steam Machine? As for compatibility, hopefully we won't stop anything that was working from working in the future - there isn't any need to.

Righty ho, so this is an issue, as I'm running Windows 11 on both my main desktop machines. I have a laptop with Windows 10, I think, so I'll go away now and test it on that. In the meantime, can you tell me the exact location you got your build from? Also, if it was installed over the top of a previous version, could you try installing to a fresh directory? Can you tell me your display setup as well - what monitor are you using?

As for the UI luminance issue, really we need a brightness setting in the UI, or ideally for it to just draw at paper white when HDR is enabled. It currently renders itself over the top of the end result of the HDR conversion (as it should), and I don't have any control over it. I can understand it would get eye-searingly bright on a 10,000-nit TV, though.

Can you also do me a favour and run the Sony Megatron shader (crt-sony-megatron-default-hdr.slangp) located in:

C:\RetroArch\pkg\msvc\x64\Debug\shaders\shaders_slang\hdr

and tell me what it does? Using an HDR-native shader like that should behave differently, as it overrides all the settings in the menu and does its own HDR10 conversion and inverse tonemapping, etc.

Actually, if you have the Sony Megatron running you won't see anything change, as it's a native HDR shader - it does all the HDR10 conversion and inverse tonemapping itself - so you have to turn it off to see my changes. I will update the Sony Megatron to use the menu options and the same HDR10 conversion and inverse tonemapper shortly - I just need RetroArch to be released before I do that, as otherwise it will just break the shader for end users.

So if that is the case we just have the DX12 failure - in that case can you post any logs for that build?

https://wiki.archlinux.org/title/Gamescope

Gamescope is a microcompositor that Valve has developed for games. My understanding is that it is basically a standalone Wayland compositor which can be manipulated to run custom resolutions, HDR, VRR and so on.

It essentially sandboxes whichever application you run in it for efficiency and performance.

It’s great for Steam’s use case (mainly running games in Proton) but I can see it having great advantages for HTPC and emulation too.

Because it sort of ignores desktop settings (I think) it can behave oddly or unexpectedly.


I'll try testing it as soon as the PR is created/merged and report the results. For some reason, the last time I tried, the HDR option didn't appear in the Linux AppImage build, only in the Flatpak build, which I couldn't compile - but maybe that changed with the merged PR updating the RetroArch HDR system.

Thanks!


You’re most welcome @MajorPainTheCactus. It has been a labour of love. I can’t begin to express how grateful I am to you and all other developers who have been creating these wonderful tools which bring so much joy to so many lives.

One request, though: is it possible to have some more granularity with respect to the slot mask height? For example, I started using some of the 4K slot mask options but found the phosphors to be a little too tall. When I switched to 8K, I found the height much more ideal in that situation. The problem with the 8K masks was that they weren't pure RGB like the 4K ones.

So my next step was to transfer the 4K masks over to the 8K section in order to keep the pure RGB effect with the shorter slot mask height that I preferred.

Also - I'm not sure if this is helpful or not - I recently read a post by @Nesguy stating that CRT-Guest-Advanced uses XRRGGBB masks instead of RRGGBBX.


Additional Colour Boost options would be desirable when using HDR without any scanlines or CRT simulation whatsoever.

For example, the gba-color.slangp shader has a parameter for WCG displays ("Color Profile (1=sRGB, 2=DCI, 3=Rec2020)") and looks best using it, but that parameter is currently incompatible with the way RetroArch does HDR, forcing you to leave it set to sRGB/709 mode (matching Colour Boost "Off"), which gives inferior results to those possible using WCG SDR.

Adding additional options to Colour Boost for "DCI" (P3-D65) and Rec2020 would allow such shader parameters to operate as intended in HDR.

Alternatively, I suppose you could revisit your old "hdr10.slang"/"inverse_tonemap.slang" idea and create a shader called "RetroArch Advanced HDR Options" or something like that, which could override the UI settings and contain additional parameters for advanced users, like these additional colour gamut settings and decoding gamma.


Ah, interesting - what was the reasoning behind XRRGGBB masks instead of RRGGBBX? I am going to revisit the slot masks, because 5K monitors can probably do them properly with their 12-pixel-high scanlines - the bar across will be proportional to the height of the scanline at that resolution - about 8.333% ish, lol.
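That 8.333% figure is just the one-pixel slot bar divided into the 12-pixel scanline. A quick check of the arithmetic, with the pixel counts as assumptions about a 5K setup:

```c
/* Assumed 5K figures: a 12-pixel-high scanline crossed by a 1-pixel slot bar.
   Returns the bar's share of the scanline height as a percentage. */
static double bar_percentage(double bar_height_px, double scanline_height_px)
{
    return bar_height_px / scanline_height_px * 100.0;
}
/* bar_percentage(1.0, 12.0) -> 8.333... */
```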


The problem, if I've understood you correctly, is that unless you know about these things you won't have the faintest idea which option to select. Casual users in the main RetroArch menu can understand 'Colour Boost ON/OFF', but I doubt they would understand 'Colour Boost DCI/Rec2020/sRGB' etc. Isn't it better this stays in the advanced section of the shaders? Don't get me wrong, it's a great thing to have, but surely simplicity has to reign, or otherwise you end up with the Sony Megatron's rat's nest of options scaring people off? lol


Tbh, I would suggest that by that logic the Colour Boost setting shouldn't exist at all, for the exact same reason that you removed the Contrast setting. The "correct" Colour Boost setting for someone who doesn't know what that setting does is always going to be Off/standard 709 primaries.

Thinking on it further, I would propose the following:

  • Make “Colour Boost” an advanced setting, so it is only visible when RetroArch’s “Show Advanced Settings” option is set to On.

  • Rename “Colour Boost” to “Content Colour Gamut” (or something along those lines?), rename “Off” to “sRGB/Rec709”, rename “On” to “Expanded709”.

  • Add options for at least “P3-D65” and “Rec2020”.

Proposed transformation matrices for hdr_common.glsl:

const mat3 kaDisplayP3_to_2020 = mat3(
   0.753833034361721f, 0.198597369052617f, 0.047569596585662f,
   0.045743848965358f, 0.941777219811693f, 0.012478931222948f,
   -0.001210340354518f, 0.017601717301090f, 0.983608623053428f);

const mat3 k2020_to_2020 = mat3(
   1.000000000000000f, 0.000000000000000f, 0.000000000000000f,
   0.000000000000000f, 1.000000000000000f, 0.000000000000000f,
   0.000000000000000f, 0.000000000000000f, 1.000000000000000f);

This would allow WCG SDR options to work correctly with RetroArch’s HDR without adding additional confusing menu options for non-advanced users.
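One quick way to sanity-check gamut matrices like the proposed kaDisplayP3_to_2020 is that every row must sum to 1.0, so that D65 white (1, 1, 1) still maps to white in the target space. A small C check of the values above (the helper function is mine, not part of hdr_common.glsl):

```c
#include <math.h>

/* The proposed Display P3 -> Rec.2020 matrix, rows as in the post above. */
static const double kaDisplayP3_to_2020[3][3] = {
    {  0.753833034361721, 0.198597369052617, 0.047569596585662 },
    {  0.045743848965358, 0.941777219811693, 0.012478931222948 },
    { -0.001210340354518, 0.017601717301090, 0.983608623053428 },
};

/* Returns 1 if every row sums to 1 within tolerance, i.e. the matrix
   preserves the white point. */
static int preserves_white(const double m[3][3], double tol)
{
    for (int r = 0; r < 3; r++) {
        const double sum = m[r][0] + m[r][1] + m[r][2];
        if (fabs(sum - 1.0) > tol)
            return 0;
    }
    return 1;
}
```

The values above pass this check, which is what you'd expect for a correctly derived P3-D65 to Rec.2020 conversion.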


I haven’t the slightest idea. Just thought I’d give you some food for thought.

I've learned quite a bit about slot masks over the last few months. The most important thing is probably that they require the most brightness to be emulated well, especially with BFI. Another is that they look much better when the horizontal slots near the center are evenly centered between the scanlines, and when there are no horizontal slots too close to the scanlines at the top and bottom. For those situations, I have experimented with increasing the size of the scanline gaps until they occlude any horizontal slots that sit too close to a scanline gap. For the vertical centering and alignment, I use the Vertical Offset shader parameter.

The thing is, this vertical alignment changes at different Integer Scale values.

I do similar things for dot masks, as it sometimes looks weird when the scanline gaps appear to cull dots, leaving less than half a dot - or a quarter, a third, or a sliver of a dot - at the top and bottom of a scanline. Perhaps some sort of optional clamping or snapping mechanism to prevent "unnatural" clipping of the slot and dot mask phosphors might be a useful feature.

This may not be a completely accurate feature, as I've seen unaligned slot masks and scanlines in real CRT photos, but it doesn't look as bad there as it does when emulated.
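A snapping mechanism like the one suggested could be as simple as rounding the mask's vertical offset to the nearest whole mask cell, so a slot is never cut to a sliver at a scanline edge. A hypothetical sketch in C - the function name and the snapping rule are my assumptions, not anything in an existing shader:

```c
/* Hypothetical: snap a vertical mask offset (in pixels) to the nearest
   multiple of the mask cell height, so slots line up with scanline edges
   instead of being clipped to slivers. Handles negative offsets too. */
static int snap_mask_offset(int offset_px, int cell_height_px)
{
    const int remainder =
        ((offset_px % cell_height_px) + cell_height_px) % cell_height_px;

    /* Round to the nearer cell boundary. */
    if (remainder * 2 >= cell_height_px)
        return offset_px + (cell_height_px - remainder);
    return offset_px - remainder;
}
```

In practice the offset would also need rederiving per integer scale, since, as noted above, the alignment shifts with the Integer Scale value.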

These are exactly the kinds of discussions and comparisons we’ve been having here: