Sony Megatron Colour Video Monitor

Really excited to see HDR updates, but unfortunately this completely breaks HDR in RetroArch on my setup: Windows 10 22H2, a 5090 on driver 591.74. The Vulkan and DX11 drivers show the same super-crushed, high-gain look, and on DX12 it doesn’t open at all. The HDR settings don’t change anything.

The previous implementation worked fine, except for the lack of a separate “UI Luminance” setting, which made the high-luminance setups you need for BFI or the CRT beam simulator extremely annoying to use because of the ultra-bright UI you’d get as a result.

Yes, sadly I haven’t had the time to devote to the Sony Megatron - hopefully I’ll be able to give it a bit more attention now.

Great stuff - yes, I might add those features depending on the direction people want to go, but then again this is a built-in scanline simulation specifically tailored to casual users who don’t venture into shader land. I don’t want to overload them with options.

What I’ll probably do instead is update the Sony Megatron - I basically already have - and update it in the shader repo (I’ll rename the current one to “legacy” so people can keep all their old presets working as-is, just repointed to the old shader). That should hopefully meet the demands of power users whilst keeping the HDR menu relatively straightforward for casual users (OK, the subpixel layouts and luminance levels aren’t the best).

I have no idea what Gamescope is - is that related to the Steam Deck/Machine? As for compatibility, hopefully we won’t stop anything that was working from working in the future - there isn’t any need to.

Righty ho, so this is an issue, as I’m running Windows 11 on both my main desktop machines. I have a laptop with Windows 10, I think, so I’ll go away now and test it on that. In the meantime, can you tell me the exact location you got your build from? Also, if it was installed over the top of a previous version, could you try installing to a fresh directory? Can you tell me your display setup as well - what monitor are you using?

As for the UI luminance issue: really we need a brightness setting in the UI or, ideally, for it to just draw at paper white when HDR is enabled. It currently renders itself over the top of the end result of the HDR conversion (as it should) and I don’t have any control over it. I can understand it would get eye-searingly bright on a 10,000-nit TV though.
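For illustration, the paper-white idea boils down to a simple scale. This is a minimal CPU-side sketch, not RetroArch’s actual API - the function and parameter names here are hypothetical:

```python
# Hypothetical sketch: scale a linear UI colour so that full white renders
# at paper white instead of the display's peak luminance.
# Names are illustrative only, not RetroArch's real API.
def scale_ui_to_paper_white(ui_rgb, paper_white_nits, peak_nits):
    scale = paper_white_nits / peak_nits
    return [channel * scale for channel in ui_rgb]

# A full-white UI on a 1000-nit display with 200-nit paper white
# comes out at 20% of peak rather than eye-searing full brightness:
dimmed = scale_ui_to_paper_white([1.0, 1.0, 1.0], 200.0, 1000.0)
```

The point is just that the UI would track the user’s paper-white setting rather than riding the full HDR range.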

Can you also do me a favour and run the Sony Megatron shader (crt-sony-megatron-default-hdr.slangp) located in:

C:\RetroArch\pkg\msvc\x64\Debug\shaders\shaders_slang\hdr

and tell me what it does? Using an HDR-native shader like that should have different behaviour, as it overrides all the settings in the menu and does its own HDR10 conversion and inverse tonemapping etc.
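To make “does its own HDR10” concrete: an HDR10-native shader ends by encoding linear luminance with the SMPTE ST.2084 PQ transfer function. Here’s a minimal CPU-side sketch of that encoding step (the constants come from the standard; everything else is illustrative):

```python
# SMPTE ST.2084 (PQ) inverse EOTF: linear luminance in cd/m^2 -> [0,1] signal.
# This is the final per-channel encoding step an HDR10-native shader performs.
def pq_encode(nits):
    m1 = 2610.0 / 16384.0
    m2 = 2523.0 / 4096.0 * 128.0
    c1 = 3424.0 / 4096.0
    c2 = 2413.0 / 4096.0 * 32.0
    c3 = 2392.0 / 4096.0 * 32.0
    y = max(nits, 0.0) / 10000.0          # normalise to the 10,000-nit PQ peak
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1.0 + c3 * y_m1)) ** m2
```

Because the shader emits PQ-encoded output itself, any PQ encode RetroArch applies afterwards would double up - which is why you have to turn the native shader off to see changes to the menu-driven HDR path.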

Actually, if you have the Sony Megatron running you won’t see anything change, as it’s a native HDR shader - it does all the HDR10 conversion and inverse tonemapping itself - so you have to turn it off to see my changes. I will update the Sony Megatron to use the menu options and the same HDR10 conversion and inverse tonemapper shortly - I just need RetroArch to be released before I do that, as otherwise it will just break the shader for end users.

So if that is the case, we just have the DX12 failure - can you post any logs for that build?

https://wiki.archlinux.org/title/Gamescope

Gamescope is a microcompositor that Valve has developed for games. My understanding is that basically it is a standalone Wayland instance which can be manipulated to run custom resolutions, HDR, VRR and so on.

It essentially sandboxes whichever application you run in it for efficiency and performance.

It’s great for Steam’s use case (mainly running games in Proton) but I can see it having great advantages for HTPC and emulation too.

Because it sort of ignores desktop settings (I think) it can behave oddly or unexpectedly.

I’ll try testing it as soon as the PR is created/merged and report the results. For some reason, the last time I tried, the HDR option didn’t appear in the Linux AppImage build - only in the Flatpak build, which I couldn’t compile - but maybe that changed with the PR merged to update the RetroArch HDR system.

Thanks!

You’re most welcome @MajorPainTheCactus. It has been a labour of love. I can’t begin to express how grateful I am to you and all other developers who have been creating these wonderful tools which bring so much joy to so many lives.

One request though: is it possible to have some more granularity with respect to the slot mask height? For example, I started using some of the 4K Slot Mask options but I found the phosphors to be a little too tall. When I switched to 8K, I found the height to be much more ideal in that situation. The problem with the 8K masks was that they weren’t pure RGB like the 4K ones.

So my next step was to transfer the 4K masks over to the 8K section in order to keep the pure RGB effect with the shorter slot mask height that I preferred.

Also, I’m not sure if this can be helpful or not, but I recently read a post by @Nesguy that stated that CRT-Guest-Advanced uses XRRGGBB masks instead of RRGGBBX.

Additional Colour Boost options would be desirable when using HDR without any scanlines or CRT simulation whatsoever.

For example, the gba-color.slangp shader has a parameter for WCG displays (“Color Profile (1=sRGB, 2=DCI, 3=Rec2020)”) and looks best using it, but that parameter is currently incompatible with the way RetroArch does HDR, forcing you to leave it set to sRGB/709 mode (matching Colour Boost “Off”), giving inferior results to those possible using WCG SDR.

Adding additional options to Colour Boost for “DCI” (P3-D65) and Rec2020 would allow such shader parameters to operate as intended in HDR.

Alternatively, I suppose you could revisit your old “hdr10.slang”/“inverse_tonemap.slang” idea and create a shader called “RetroArch Advanced HDR Options” or something like that, which could override the UI settings and contain additional parameters for advanced users, like these additional colour gamut settings and decoding gamma.

Ah, interesting - what was the reasoning behind XRRGGBB masks instead of RRGGBBX? I am going to revisit the slot masks because 5K monitors can probably do them properly with their 12-pixel-high scanlines - the bar across will be proportional to the height of the scanline at that resolution - about 8.333% ish lol.
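For what it’s worth, the two layouts are just one-subpixel rotations of the same repeating cell (where ‘X’ is the dark gap between phosphor triads) - the gap leads instead of trails. A quick, purely illustrative sketch makes the relationship obvious:

```python
# Tile a subpixel layout string (e.g. 'RRGGBBX') across a row of subpixels.
# 'X' marks the dark gap between RGB phosphor triads.
def build_mask(layout, width):
    return [layout[i % len(layout)] for i in range(width)]

rrggbbx = build_mask('RRGGBBX', 14)   # R R G G B B X R R G G B B X
xrrggbb = build_mask('XRRGGBB', 14)   # X R R G G B B X R R G G B B
# Same repeating pattern, shifted by one subpixel.
```

So the choice between them is about where the gap falls relative to the physical subpixels of the display, not about different colours.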

The problem, if I’ve understood you correctly, is that unless you know about these things you won’t have the faintest idea which option to select. Casual users in the main RetroArch menu can understand ‘Colour Boost ON/OFF’, but I doubt they would understand ‘Colour Boost DCI/Rec2020/sRGB’ etc. Isn’t it better to keep this in the advanced section of the shaders? Don’t get me wrong, it’s a great thing to have, but surely simplicity has to reign or otherwise you end up with the Sony Megatron’s rat’s nest of options scaring people off? lol

Tbh, I would suggest that by that logic the Colour Boost setting shouldn’t exist at all, for the exact same reason that you removed the Contrast setting. The “correct” Colour Boost setting for someone who doesn’t know what it does is always going to be Off/standard 709 primaries.

Thinking on it further, I would propose the following:

  • Make “Colour Boost” an advanced setting, so it is only visible when RetroArch’s “Show Advanced Settings” option is set to On.

  • Rename “Colour Boost” to “Content Colour Gamut” (or something along those lines?), rename “Off” to “sRGB/Rec709”, rename “On” to “Expanded709”.

  • Add options for at least “P3-D65” and “Rec2020”.

Proposed transformation matrices for hdr_common.glsl:

const mat3 kaDisplayP3_to_2020 = mat3(
   0.753833034361721f, 0.198597369052617f, 0.047569596585662f,
   0.045743848965358f, 0.941777219811693f, 0.012478931222948f,
   -0.001210340354518f, 0.017601717301090f, 0.983608623053428f);

const mat3 k2020_to_2020 = mat3(
   1.000000000000000f, 0.000000000000000f, 0.000000000000000f,
   0.000000000000000f, 1.000000000000000f, 0.000000000000000f,
   0.000000000000000f, 0.000000000000000f, 1.000000000000000f);
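As a quick sanity check of the proposed P3 matrix (a CPU-side sketch, not shader code): a valid RGB-to-RGB gamut conversion maps reference white to reference white, so every row should sum to 1, and a pure P3 primary should land inside Rec.2020 as a slightly desaturated version of itself.

```python
# The proposed Display-P3 -> Rec.2020 matrix from above, checked on the CPU.
P3_TO_2020 = [
    [ 0.753833034361721, 0.198597369052617, 0.047569596585662],
    [ 0.045743848965358, 0.941777219811693, 0.012478931222948],
    [-0.001210340354518, 0.017601717301090, 0.983608623053428],
]

def apply_mat3(m, rgb):
    # Row convention: out[r] = dot(m[r], rgb)
    return [sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3)]

# White must map to white (each row sums to ~1); pure P3 red becomes a
# red with small positive green and near-zero blue in Rec.2020.
white = apply_mat3(P3_TO_2020, [1.0, 1.0, 1.0])
red = apply_mat3(P3_TO_2020, [1.0, 0.0, 0.0])
```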

This would allow WCG SDR options to work correctly with RetroArch’s HDR without adding additional confusing menu options for non-advanced users.

I haven’t the slightest idea. Just thought I’d give you some food for thought.

I’ve learned quite a bit about slot masks over the last few months. The most important thing is probably that they require the most brightness to be emulated well, especially with BFI. Another is that they look much better when the horizontal slots near the centre are evenly centred between the scanlines, and when there are no upper or lower horizontal slots too close to the scanlines. For those situations, I have experimented with increasing the size of the scanline gaps until they occlude any horizontal slots near a scanline gap. For the vertical centering and alignment, I use the “Vertical Offset” shader parameter.

The thing is, this vertical alignment changes at different Integer Scale values.

I do similar things for dot masks, as sometimes it looks weird when the scanline gaps appear to cull dots, leaving less than half a dot - a quarter, a third, or a sliver - at the top and bottom of a scanline. Perhaps some sort of optional clamping or snapping mechanism to prevent “unnatural” clipping of the slots and dot mask phosphors might be a useful feature.
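One possible shape for that snapping idea, as a rough sketch - the rule and all the names here are mine for illustration, not from any existing shader: pick the vertical offset that centres the mask-cell grid within each scanline, so any pixels not filled by whole cells are split evenly between top and bottom rather than left as a one-pixel sliver on one side.

```python
# Illustrative snapping rule (not from any existing shader): centre the
# mask-cell grid inside each scanline so leftover pixels are split evenly
# between top and bottom instead of leaving a sliver on one side.
def snapped_vertical_offset(integer_scale, cell_height):
    leftover = integer_scale % cell_height   # pixels not filled by whole cells
    return leftover // 2                     # push half the leftover above

# At 12 lines per scanline with 5-pixel cells, 2 pixels are left over,
# so the grid shifts down by 1 to centre the partial cells.
```

A rule like this would also explain why the ideal Vertical Offset changes with the Integer Scale value: the leftover depends on the scale.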

This may not be a completely accurate feature, as I’ve seen unaligned slot masks and scanlines in real CRT photos, but it doesn’t look as bad there as it does when emulated.

Some raw unedited thoughts which came to mind while analyzing and comparing the previous JVC-D Series pics to my Shader preset pics of Tyris' forearm.

Negative Glow - a darkening transparency effect on the scanlines that is weaker where the scanline gaps are narrower (over bright pixels) and stronger where the scanline gaps are wider (over dark pixels).

Observations: the slots are sharp, while the edges of the scanlines are soft.

NesGuy’s grid sort of simulates this.

In the Sony Megatron unaligned example with Tyris’ forearm, both the slot and the scanline gap are sharp. Only the slot should be sharp, and the interaction of the mask and the scanline gap over bright areas could be modulated by adjusting the alpha transparency between the phosphors/mask and the scanline gaps.

Currently the dynamics seem to be a bit two-dimensional. What’s needed is three-dimensional dynamics between the mask/phosphor and scanline layers, in order to more accurately emulate the natural blending that takes place when these different elements overlap.

…And let’s not forget the unexcited vertical Phosphor stripes. That’s a layer as well.

At no point do the scanline gaps sharply bisect the phosphor triads in bright areas, where the scanline gaps are narrow, as we’re seeing in the Tyris’ forearm example.

On a real CRT, it seems as if Phosphor Strength = Brightness x Saturation and is proportional to Beam Strength, which is inversely proportional to Scanline Gap Strength.

So on a bright pixel the scanline gap cannot get dark enough to fully occlude a Phosphor triad.

This is not the case with the slots. They are always at maximum strength as they are a passive component.
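The relationship described above can be put into a toy model - all the names and the simple linear falloff here are illustrative, not how any particular shader actually works:

```python
# Toy model of the observation: the scanline gap's darkening weakens as beam
# strength rises, while the slot mask is passive and always fully dark.
def gap_strength(beam_strength):
    # Inversely related to beam strength, clamped to [0, 1].
    return max(0.0, min(1.0, 1.0 - beam_strength))

def shade(phosphor, beam_strength, in_gap, in_slot):
    value = phosphor * beam_strength
    if in_slot:
        return 0.0                      # passive mask: always maximum strength
    if in_gap:
        # The gap can only darken as much as the beam allows.
        return value * (1.0 - gap_strength(beam_strength))
    return value
```

Under this model a fully bright pixel survives the gap untouched while a dim one is fully occluded there - matching the observation that the gap never sharply bisects a bright triad, while the slots stay dark everywhere.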

These are exactly the kinds of discussions and comparisons we’ve been having here:

This is the taller Slot Mask with slot mask and scanlines aligned and scanline gaps adjusted to cull horizontal slots appearing too close to the scanline gaps:

https://forums.libretro.com/uploads/default/original/3X/d/9/d971b3362444712fcf0a087a99749d763d3feb73.jpeg

This is the consensus @Nesguy and I have reached when it comes to Slot Masks with BFI and brightness:

8K Masks Modified to 4K Mask Layouts:

In this example the slot mask height is not as tall and overall the alignment is not as strict. Similarly, for using scanline gaps to cull the horizontal slots appearing close to the scanlines: it doesn’t look so bad when the scanline gaps are larger in the darker areas, but it looks a bit strange on Tyris’ forearm, where you can see the slot mask slots appearing close to the scanline. I think on a real CRT there would be a more gradual fade/drop-off to black, which might make things like that less jarring/noticeable.

So, in other words, what I’m trying to say is that when horizontal slot mask slots and dot mask dots appear close to thin or similarly sized scanline gaps, it produces strange moiré patterns.

https://forums.libretro.com/uploads/default/original/3X/6/a/6a3b71b29803ae7a3e5a44f5e16541429e690e7d.jpeg

Proposal: add a 1440p section to the Display’s Resolution setting and replace the 8K section with additional 4K masks of similar height to the existing 8K masks, or revamp the Mask/TVL selection to show the actual mask layouts so more advanced users have a little more control and transparency.

I think 1440p users might be feeling a bit lost or left out when there isn’t a 1440p selection in the Display’s Resolution section. 1440p is one of the most common PC display resolutions, it does a great job with many mask layouts and TVLs, and some of the highest-refresh-rate monitors are of the 1440p variety.

CyberLab Death To Pixels Shader Preset Packs

You have to check out the Shader Stack I use in my latest “CyberLab Megatron miniLED Epic Death To Pixels 4K HDR Shader Preset Pack 31-12-25”.

It adds quite a few quality of life features via additional shaders:

It has the awesome CRT-Guest-Advanced-NTSC section with features like Afterglow, RF Noise and Font Preservation thanks to the amazing work of @guest.r !

It uses a modified IMG-Mod shader by @HunterK, which adds built-in support for proper overscan mask cropping and automatic recentering, along with film grain and rounded corners. I recently replaced the film grain version with one that doesn’t include film grain, in favour of CRT-Guest-Advanced’s RF Noise.

It also includes the full Grade, which was needed for things like the Sega Genesis/Mega Drive Luma Fix and Palette, as well as the SMS Blue Fix.

Last but not least, it integrates XBR-LVL-2 for subtle smoothing, which helps bridge the gap between analog CRT displays, with their “natural” anti-aliasing, and modern digital displays, which can sometimes be either a bit too sharp and aliased or a bit blurry.

The only optional features I might have liked to see are support for reflective bezels and overlays, internal handling of scaling and aspect ratios, and maybe CRT-Beam-Simulator support - but I’ve never gotten that last one to work properly on my display, so I stick with my TV’s built-in BFI/strobing.

These are examples of what the stack is capable of in the right hands:

CyberLab Death To Pixels Shader Preset Packs

Most recently I’ve been focusing on making things brighter for “normal” folks.

CyberLab Death To Pixels Shader Preset Packs

Yes, it’s absolutely a fair point, but there is a kind of slightly washed-out look when displaying SDR content on an HDR screen that Microsoft was specifically trying to fix with this expanded-gamut matrix. I kind of agree with them, and it does give the scanline simulation more of a colour ‘pop’, although it’s incorrect relative to the original.

Look, I totally agree that this is technically the correct thing to do, but if I asked a casual user what ‘Content Colour Gamut’ means, I’m sure 95% (99%?) wouldn’t have the faintest idea - it’s the reason why I renamed ‘Expand Gamut’ to ‘Colour Boost’. The reason it’s in there is because it was in there from the start; if I started fresh today maybe I wouldn’t add it, or I’d hide it in advanced settings, but it’s there purely for the users who don’t care and just like a bit of a colour boost. I’m open to being swayed on the matter though.

Yes, the whole thing about 5K displays and slot masks is that you can have them centred on the scanline and at about the right height proportionally to the scanline - it’s the perfect resolution for them.

I’m with you on this.

If you’re open to being swayed on it, then I say leave it be. It’s an option; no one is forcing anyone to use it, and different users have different tastes as well as different setups. How many users even had colourimetry tools and calibrated CRTs growing up?

Many if not most of my presets including some of my most recent ones use this expanded colour gamut mainly because I use my eyes to calibrate, I like to experiment and it simply looks better to me.

I don’t think it should be hidden from casual users, nor from advanced users.

If it’s removed or deprecated, I can see myself having to abuse the saturation parameter to compensate, which would probably be less than ideal.

Variety is the spice of life.

Hmm… perhaps keeping the basic setting as Colour Boost and giving the new gamuts nicknames would work? Say, Off = 709 (same as now), On = Expanded709 (same as now), Balanced = P3-D65, and Extreme = Rec2020? That way the options are there for advanced users to make WCG SDR shaders work correctly in HDR, while more casual users get the knock-on benefit of additional options in a form that is easier to parse?

(It could also be Off/Light/Full/Extreme, or anything else. The gamuts being available and what the setting actually does being documented for advanced users is the important thing.)

What it ultimately comes down to is that it is currently impossible to make gba-color.slangp and other similar shaders work correctly to their full potential with HDR enabled. You have to disable HDR and use WCG SDR instead.

As I see it, the available options to fix this are:

a) Adding those additional gamuts to Colour Boost, regardless of how we name those settings.

b) Making a bespoke “Advanced HDR Options” shader that overwrites the Colour Boost option with those gamuts available as a parameter.

c) Making it possible to turn off scanlines in Megatron like you can in guest-advanced, in which case Megatron would do double duty as the hypothetical “Advanced HDR Options” shader from option b.

I don’t ultimately care how we get there, but I feel very strongly that one of those options needs to happen in some form to make RetroArch’s HDR support more feature complete.
