Sony Megatron Colour Video Monitor

Good evening everyone, long time no see. I hope you’re all doing well.

Ok, so 2026 is going to be a big year for displays if CES 2026 is anything to go by. We saw TCL show off 9000-nit TVs, Samsung and LG show off proper RGB subpixel technology for QD-OLEDs and OLEDs respectively, and of course, we saw Nvidia release G-Sync Pulsar—the ultimate conclusion to LCD backlight strobing for motion clarity.

All of that (along with a recent Retro Crisis video showing the pain of setting up HDR) gave me enough enthusiasm to revisit my HDR implementation and upgrade it.

I set out with a number of goals, which I have now implemented:

1. Simplify the end user experience:

Previously, users had a "three-body problem" to solve in the HDR menu, balancing Peak Luminance, Paper White Luminance, and Contrast. I have done away with Contrast, as it is not needed. Users now set a one-time Peak Luminance value and from then on just use Paper White as their overall brightness control.
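
For reference, this is roughly what the simplified setup looks like in retroarch.cfg (assuming the existing video_hdr_* setting names; the values are just examples for a ~1000-nit display):

```
video_hdr_enable = "true"
# Set once to your display's peak luminance:
video_hdr_max_nits = "1000.0"
# Then use paper white as the day-to-day brightness control:
video_hdr_paper_white_nits = "200.0"
```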

Also, previously when users went into native HDR shaders, they were expected to change yet another set of shader-internal parameters that seemingly conflicted with the menu settings, adding another layer of confusion. I’ve exposed the HDR menu settings to the custom shaders so this should no longer be an issue, once the custom shaders have been updated to take advantage of the new features (I will be updating Sony Megatron soon!).

2. Improve quality:

It's been noted for a long time that there are better ways to do HDR. The previous implementation in RetroArch actually dated back nearly a decade (although it didn't appear in RetroArch until four years ago). That was in the early days of HDR, and it was built around constraints of the displays of the time that no longer exist.

The old implementation had an artificial “elbow” in the mapping: below 0.5, the SDR image was mapped linearly to HDR space, and above 0.5, the highlights were expanded out to the rest of the HDR range. This effectively only dealt with highlights. I have updated this to remove the elbow and simply map the entire SDR space into the available HDR space, handling shadows and mid-tones much better. Whether you’ll really notice this in retro gaming is another matter, but it is definitely an improvement.
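
To illustrate the shape change, here's a rough sketch in C (my illustration only - the actual curves in the PR may differ; v is linear SDR in [0,1], paper_white and peak are in nits):

```c
#include <math.h>

/* Old shape (roughly): linear below an elbow at 0.5, with only the top
   half of the SDR range stretched out to the HDR peak. */
static float map_old(float v, float paper_white, float peak)
{
   if (v <= 0.5f)
      return v * paper_white;              /* shadows/mid-tones untouched */
   float t = (v - 0.5f) * 2.0f;            /* 0..1 across the highlights  */
   return 0.5f * paper_white + t * (peak - 0.5f * paper_white);
}

/* New shape (roughly): one continuous curve maps the entire SDR range
   into the available HDR range, so shadows and mid-tones get
   redistributed too. Here the exponent is chosen so mid-grey stays put
   and v = 1 lands exactly on peak. */
static float map_new(float v, float paper_white, float peak)
{
   float g = log2f(peak / (0.5f * paper_white));
   return peak * powf(v, g);
}
```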

I also fixed the way we scale linear colour to determine how much to boost it. Previously, I used a luma coefficient to get a grayscale value; this resulted in reds, for example, not being boosted as much as they should be, and whites being over-boosted, causing a washed-out look. We now use the maximum of the RGB channels instead, which is a much better solution.
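
In code terms the change is roughly this (a sketch, not the exact PR code):

```c
#include <math.h>

/* Old: Rec.709 luma as the "how bright is this colour" estimate. A pure
   red (1,0,0) only registers as 0.2126, so saturated colours were boosted
   too little while whites were over-boosted, washing the image out. */
static float brightness_luma(float r, float g, float b)
{
   return 0.2126f * r + 0.7152f * g + 0.0722f * b;
}

/* New: max(R,G,B). A fully saturated primary counts as fully bright, so
   every hue is expanded consistently. */
static float brightness_maxrgb(float r, float g, float b)
{
   return fmaxf(r, fmaxf(g, b));
}
```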

3. Future-proofing for efficient hardware:

I believe the future of this tech for retro gaming lies in cheaper devices paired with more expensive displays—i.e., most people are willing to spend an awful lot on the TV in their living room rather than the PC in their bedroom (although looking at memory and 5090 prices, that isn’t universally true!).

To that end, I wanted to target Raspberry Pi 5 level hardware that you can put behind your Samsung or LG OLED in the lounge. To achieve this, I’ve updated the RetroArch HDR system to be as efficient as possible by offloading as much processing to the display as possible. I’ve included a fast, single-pass scanline simulation for HDR aimed at running close to 60fps at 4K.
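
To give a feel for what "single pass" means here, a minimal sketch of the general idea (my illustration, not the shipped shader): each output pixel is darkened by its vertical distance from the centre of the nearest source scanline, all in one fragment evaluation.

```c
#include <math.h>

/* Scanline weight for one output pixel; multiply the sampled colour by it.
   out_y/out_h: output pixel row and output height; src_h: source vertical
   resolution; hardness: 1 = soft rolloff, higher = harder, darker gaps. */
static float scanline_weight(float out_y, float out_h, float src_h,
                             float hardness)
{
   float src_y  = (out_y + 0.5f) * src_h / out_h; /* position in source    */
   float centre = src_y - floorf(src_y) - 0.5f;   /* -0.5..0.5 within line */
   float w      = 1.0f - fabsf(centre) * 2.0f * hardness;
   return fmaxf(w, 0.0f);
}
```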

My aim is to bring this HDR implementation to LakkaTV on the Pi 5. The guys in charge of LakkaTV and the OS it sits on have done most of the heavy lifting, and we just need RetroArch to pass HDR metadata to it. I’ll hopefully get the time soon to add that specific support, but in the meantime, the core RetroArch HDR system is now updated and ready.

So, two things are to come on top of the large update to RetroArch:

  1. The Sony Megatron shader is going to get an upgrade to its scanline simulation, for better HDR blooming, plus better inverse tone mapping and HDR10 support. Hopefully, there will be a good performance boost and the shader parameters will be simplified.

  2. HDR support on Linux through the KMS/DRM layer and the Vulkan driver, rather than desktop Wayland/X11 (this is the tech used by the Steam Deck/SteamOS). This has actually mostly been done already by others; we just need to pass through the Vulkan HDR metadata correctly and enable HDR (see the sketch below).
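
For the curious, the Vulkan side of that hand-off is the standard VK_EXT_hdr_metadata call. Here's a minimal sketch with example values (the real luminance numbers would come from the user's settings, and in real code the function pointer is fetched via vkGetDeviceProcAddr):

```c
#include <vulkan/vulkan.h>

/* Attach HDR10 mastering metadata to an existing swapchain. */
static void set_hdr10_metadata(VkDevice device, VkSwapchainKHR swapchain)
{
   VkHdrMetadataEXT meta = {
      .sType                     = VK_STRUCTURE_TYPE_HDR_METADATA_EXT,
      .displayPrimaryRed         = { 0.708f,  0.292f  },  /* BT.2020 primaries */
      .displayPrimaryGreen       = { 0.170f,  0.797f  },
      .displayPrimaryBlue        = { 0.131f,  0.046f  },
      .whitePoint                = { 0.3127f, 0.3290f },  /* D65 */
      .maxLuminance              = 1000.0f,               /* user's peak nits */
      .minLuminance              = 0.001f,
      .maxContentLightLevel      = 1000.0f,
      .maxFrameAverageLightLevel = 400.0f,
   };
   vkSetHdrMetadataEXT(device, 1, &swapchain, &meta);
}
```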

Anyway, lots to do!

7 Likes

Here’s the merged pull request with some more description of the changes contained inside it:

https://github.com/libretro/RetroArch/pull/18616

3 Likes

That's great news - thank you for your work and for these amazing shaders! I started using the Sony Megatron shaders only recently, after spending days fine-tuning the parameters to make them look good on my cheap monitor (1080p, only ~300 nits, though it looks good if I use an aperture grille mask and smooth the scanlines). Even so, I'm still surprised at how good it looks; I can get up close and see the details of the mask simulation.

> My aim is to bring this HDR implementation to LakkaTV on the Pi 5. The guys in charge of LakkaTV and the OS it sits on have done most of the heavy lifting, and we just need RetroArch to pass HDR metadata to it. I'll hopefully get the time soon to add that specific support, but in the meantime, the core RetroArch HDR system is now updated and ready.

Is that related to this issue? https://github.com/libretro/RetroArch/issues/18186

I made some comments there while we were trying to debug it. HDR seems broken on both Linux and Android, although according to some comments there it depends on the quality of the user's screen and its HDR support. Due to my current monitor, my initial attempts at using it went very badly, so I'm using SDR for now.

1 Like

Welcome back! Good to see you back at it.

When users go to RTINGS, they see multiple Peak Luminance values for different window sizes, so it's a bit confusing. Which window size should we use when we go to RTINGS to determine the Peak Luminance value for our RetroArch or Sony Megatron Colour Video Monitor setup? Is it 2% Window Peak Brightness, 5% Window Peak Brightness, 10% Window Peak Brightness, 25% Window Peak Brightness, etc.?

On another note, there seems to be a show-stopping bug since AMD updated their Vulkan drivers.

CRT-Guest-Advanced users using supposedly similar masks and TVLs are not experiencing this, nor are DX11/DX12 video driver users.

Thanks again for your excellent work on RetroArch and your amazing shader(s)!

1 Like

Hi @MatheusWillder, I'm glad you got it working to a satisfactory level.

With regards to that post, that's exactly it. I believe the underlying OS, driver, and hardware are all supported; the only sticking point is the Pi 5's HDMI port spec for 60fps (I haven't looked into it). So all that should need setting is the HDR metadata. I got HDR working on a personal build of Android RetroArch; it kicks and screams a bit, saying it doesn't support rgb10a2, but if you swap it over to rgba16 it's fine (see the sketch below). I'm sure different devices will support different things. Whether the Pi 5 will play ball is another question, but we can only try.
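
In Vulkan terms that fallback looks something like this (a sketch of the idea, not RetroArch's actual code):

```c
#include <vulkan/vulkan.h>

/* Prefer a 10-bit HDR10 surface format; fall back to 16-bit float if the
   device (like the Android one above) refuses rgb10a2. */
static VkSurfaceFormatKHR pick_hdr_format(const VkSurfaceFormatKHR *formats,
                                          uint32_t count)
{
   for (uint32_t i = 0; i < count; i++)
      if (formats[i].format == VK_FORMAT_A2B10G10R10_UNORM_PACK32 &&
          formats[i].colorSpace == VK_COLOR_SPACE_HDR10_ST2084_EXT)
         return formats[i];                /* rgb10a2 + ST.2084 */

   for (uint32_t i = 0; i < count; i++)
      if (formats[i].format == VK_FORMAT_R16G16B16A16_SFLOAT)
         return formats[i];                /* rgba16f fallback */

   return formats[0];                      /* no HDR; take whatever is first */
}
```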

1 Like

Hi Cyber, firstly thanks for all your hard work and effort over the years - it's greatly appreciated by me (and I'm sure by the community at large).

> When users go to RTINGS, they see multiple Peak Luminance values for different window sizes, so it's a bit confusing. Which window size should we use when we go to RTINGS to determine the Peak Luminance value for our RetroArch or Sony Megatron Colour Video Monitor setup? Is it 2% Window Peak Brightness, 5% Window Peak Brightness, 10% Window Peak Brightness, 25% Window Peak Brightness, etc.?

Yes, it's a very good point, and sadly it doesn't have a simple answer. The general rule of thumb is that OLED users should look at the 2-5% window Peak Luminance figures and LCD (LED-backlit) users should look towards the 50-100% window Sustained Luminance figures.

Why?

Well, that's a bit more interesting. An OLED can turn each subpixel on and off individually - off subpixels draw no power and so don't contribute to the ABL (Auto Brightness Limiter) power limits. This works great for the Sony Megatron, with its strict full mask and scanline simulation, as most of the subpixels are turned off. See this table of calculations:

| Component | Efficiency | Cumulative |
| --- | --- | --- |
| Native Panel | 100% | 100% |
| RGBX Mask (3/12 subpixels) | 25% | 25% |
| Bezier Scanlines (45% AVL) | 45% | 11.25% |
| G-Sync Pulsar (Strobing)/BFI @ 240Hz | 25% | 2.8% |

So if you ran the Sony Megatron on its own on a full white screen, you'd get 11.25% of the subpixels lit up, and if you have BFI on then it's just 2.8% of them turned on over a four-frame cycle. Hence the 2-5% window Peak Luminance is the metric to look at for OLEDs.

For MiniLED LCDs, though, it's a different story. They use dimming zones, i.e. a group of LEDs lighting up a section of the LCD panel, and it's these LEDs that use the power and generate the heat. As such, they can't take advantage of turned-off subpixels, so a full white screen with the Sony Megatron's masks and scanlines uses the same backlight power as a full white screen without them. Hence backlit LCD users need to look at the 50-100% window Sustained Luminance figures.

Luckily, it just so happens that LCDs are much better at sustained luminance than OLEDs, so in a lot of cases LCDs at 50-100% sustained are still brighter than OLEDs at 2-5% peak. But the Sony Megatron certainly narrows the brightness gap between the two technologies drastically.

Just out of curiosity, I plugged in the above figures to see what simulating various CRT displays would require on an OLED/LCD with backlight strobing/BFI turned on for motion clarity, and we're looking at these kinds of figures (worked through in the snippet after the list). It's worth noting that human perception of brightness scales roughly logarithmically with nits, i.e. ever larger nit increases are needed for ever smaller perceptual brightness increases - think of the gamma 2.2 curve:

  • To match a 100-nit BVM: You need 3,571 nits.

  • To match a 150-nit “Punchy” BVM: You need 5,357 nits.

  • To match an “HDR CRT” (400-nit peaks): You need 14,285 nits.
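
Those numbers fall straight out of the cumulative efficiency above - a quick check:

```c
/* Required panel nits = target CRT nits / cumulative efficiency.
   Mask 25% x scanlines 45% = 11.25%; x strobing/BFI 25% = ~2.8%. */
static double required_nits(double crt_nits)
{
   return crt_nits / 0.028;
}
/* required_nits(100.0) ~= 3571
   required_nits(150.0) ~= 5357
   required_nits(400.0) ~= 14286 */
```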

3 Likes

As for the Vulkan AMD driver issue, it's something to do with the drivers messing up the output. Sony Megatron is pretty simple - it just turns off subpixels - so if the driver does some kind of reconstruction/compression/fancy new AI feature, it's going to wreck the Sony Megatron and, to an extent, all other output. Sadly this is one for AMD, as I can't do anything about it. Maybe post on the Phoronix forums - they do a lot of work with AMD Vulkan drivers - and see if they know whether it's a known issue.

1 Like

Welcome back! Very excited to see these updates in action.

(Just as well that I hadn't gotten around to actually posting that fork topic xD)

2 Likes

This is an absolutely massive night and day upgrade. Well done and thank you for your efforts. I eagerly await digging into a version of Megatron that includes these updates.

In case anyone is wondering, with this update:

  • Setting both Peak Luminance and Paper White to 100 cd/m² matches the SDR baseline.

  • Colour Boost “Off” uses Rec709/sRGB primaries. “On” stretches the gamut to Expanded709.

@MajorPainTheCactus Could you consider adding additional options to Colour Boost for P3-D65 and Rec2020 primaries? This would allow non-HDR aware shaders with WCG options to work correctly.

(Rec601 and PAL/EBU would be nice as well, but admittedly wouldn’t really fall under the definition of “boost” xD)

2 Likes

What's the impact of this for HDR users who run RetroArch in Gamescope? The Steam Deck and SteamOS are growing in popularity, and with the advent of the Steam Machine we may see even more users running RetroArch (or solutions like RetroDECK/EmuDeck). Most (including myself) will be running SteamOS or a fork like Bazzite. Will HDR still work after this change to the KMS/DRM layer?

Excited to see the new updates to the shader!

1 Like

Really excited to see HDR updates, but unfortunately this completely breaks HDR in RetroArch on my setup: Windows 10 22H2, 5090 on 591.74. The Vulkan and DX11 drivers show the same super-crushed, high-gain look, and on DX12 it doesn't open. The HDR settings don't change anything.

The previous implementation worked fine, except for the lack of a separate "UI Luminance" setting, which made high-luminance setups - like you need for BFI or the CRT beam simulator - extremely annoying to use, given the ultra-bright UI you'd get as a result.

1 Like

Yes, sadly I haven't had the time to devote to the Sony Megatron - hopefully I'll be able to give it a bit more time now.

Great stuff. Yes, I might add those features depending on the direction people want to go, but then again this is a built-in scanline simulation specifically tailored to casual users who don't venture into shader land - I don't want to overload them with options.

What I'll probably do instead is update the Sony Megatron - I basically already have - and push the update to the shader repo (I'll rename the current one to 'legacy' so people can keep all their old presets working as-is with just a repointing to the old shader). That should hopefully meet all the demands of power users whilst keeping the HDR menu relatively straightforward for casual users (OK, subpixel layouts and luminance levels aren't the best).

I have no idea what Gamescope is - is that related to the Steam Deck/Machine? As for compatibility, hopefully we won't stop anything that was working from working in the future - there isn't any need to.

Righty ho, so this is an issue, as I'm running Windows 11 on both my main desktop machines. I have a laptop with Windows 10, I think, so I'll go away now and test on that. In the meantime, can you tell me the exact location you got your build from? Also, if it was installed over the top of a previous version, could you try installing to a fresh directory? Can you tell me your display setup as well - what monitor are you using?

As for the UI luminance issue, really we need a brightness setting for the UI, or ideally for it to just draw at paper white when HDR is enabled. It currently renders itself over the top of the end result of the HDR conversion (as it should) and I don't have any control over it. I can understand it would get eye-searingly bright on a 10,000-nit TV though.

Can you also do me a favour and run the Sony Megatron shader (crt-sony-megatron-default-hdr.slangp) located in:

C:\RetroArch\pkg\msvc\x64\Debug\shaders\shaders_slang\hdr

and tell me what it does? A native HDR shader like that should behave differently, as it overrides all the settings in the menu and does its own HDR10 conversion and inverse tone mapping, etc.

Actually, if you have the Sony Megatron running you won't see anything change, as it's a native HDR shader, as in it does all the HDR10 conversion and inverse tone mapping itself - you have to turn it off to see my changes. I will update the Sony Megatron to use the menu options and the same HDR10 conversion and inverse tonemapper shortly - I just need RetroArch to be released before I do that, as otherwise it will just break the shader for end users.

So if that is the case, we just have the DX12 failure - can you post any logs for that build?

https://wiki.archlinux.org/title/Gamescope

Gamescope is a microcompositor that Valve has developed for games. My understanding is that basically it is a standalone Wayland instance which can be manipulated to run custom resolutions, HDR, VRR and so on.

It essentially sandboxes whichever application you run in it for efficiency and performance.

It’s great for Steam’s use case (mainly running games in Proton) but I can see it having great advantages for HTPC and emulation too.

Because it sort of ignores desktop settings (I think) it can behave oddly or unexpectedly.

1 Like

I'll try testing it as soon as the PR is created/merged and report the results. For some reason, the last time I tried, the HDR option didn't appear in the Linux AppImage build, only in the Flatpak build, which I couldn't compile; but maybe that has changed with the merged PR updating the RetroArch HDR system.

Thanks!

1 Like

You’re most welcome @MajorPainTheCactus. It has been a labour of love. I can’t begin to express how grateful I am to you and all other developers who have been creating these wonderful tools which bring so much joy to so many lives.

One request though: is it possible to have some more granularity with respect to the slot mask height? For example, I started using some of the 4K slot mask options but found the phosphors to be a little too tall. When I switched to 8K, I found the height to be much more ideal. The problem with the 8K masks was that they weren't pure RGB like the 4K ones.

So my next step was to transfer the 4K masks over to the 8K section in order to keep the pure RGB effect with the shorter slot mask height that I preferred.

Also, I'm not sure if this is helpful or not, but I recently read a post by @Nesguy stating that CRT-Guest-Advanced uses XRRGGBB masks instead of RRGGBBX.

1 Like

Additional Colour Boost options would be desirable when using HDR without any scanlines or CRT simulation whatsoever.

For example, the gba-color.slangp shader has a parameter for WCG displays (“Color Profile (1=sRGB, 2=DCI, 3=Rec2020)”) and looks best when using it, but that parameter is currently incompatible with the way RetroArch does HDR, forcing you to leave it set to sRGB/709 mode (matching Colour Boost “Off”) and giving inferior results to those possible with WCG SDR.

Adding additional options to Colour Boost for “DCI” (P3-D65) and Rec2020 would allow such shader parameters to operate as intended in HDR.

Alternatively, I suppose you could revisit your old “hdr10.slang”/“inverse_tonemap.slang” idea and create a shader called “RetroArch Advanced HDR Options” or something like that, which could override the UI settings and contain additional parameters for advanced users, like these additional colour gamut settings and decoding gamma.
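
To make the request concrete, here's a sketch of what those extra options might look like under the hood (the option wiring is hypothetical; the matrices are the standard linear-RGB conversions into the Rec.2020 container that HDR10 uses):

```c
/* Hypothetical Colour Boost gamut table: the gamut the shader output is
   authored in, each converted into Rec.2020 before the PQ encode. */
static const float k709_to_2020[3][3] = {
   {  0.6274f, 0.3293f, 0.0433f },
   {  0.0691f, 0.9195f, 0.0114f },
   {  0.0164f, 0.0880f, 0.8956f },
};

static const float kP3D65_to_2020[3][3] = {
   {  0.7530f, 0.1987f, 0.0483f },
   {  0.0457f, 0.9416f, 0.0127f },
   { -0.0012f, 0.0176f, 0.9836f },
};

/* Rec.2020-native shader output would simply pass through (identity). */
```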

2 Likes