Hi Cyber, firstly, thanks for all your hard work and effort over the years; it’s greatly appreciated by me (and, I’m sure, the community at large).
When users go to RTINGS they see multiple Peak Luminance values for different window sizes, so it’s a bit confusing. Which window size should we use when we go to RTINGS to determine the Peak Luminance value for our RetroArch or Sony Megatron Colour Video Monitor setup? Is it 2% Window Peak Brightness, 5% Window Peak Brightness, 10% Window Peak Brightness, 25% Window Peak Brightness, etc.?
Yes, it’s a very good point, and sadly it doesn’t have a simple answer. The general rule of thumb is that OLED users should look at the 2-5% window Peak Luminance figures, and LCD (LED-backlit) users should look at the 50-100% window Sustained Luminance figures.
Why?
Well, that’s a bit more interesting. An OLED can turn each subpixel on and off individually: off subpixels draw no power, so they don’t count against the ABL (Auto Brightness Limiter) power limits. This works great for the Sony Megatron, with its strict full mask and scanline simulation, as most of the subpixels are turned off. See this table of calculations:
| Component | Efficiency | Cumulative Percentage |
|---|---|---|
| Native Panel | 100% | 100% |
| RGBX Mask (3/12 subpixels) | 25% | 25% |
| Bezier Scanlines (45% AVL) | 45% | 11.25% |
| G-Sync Pulsar (Strobing)/BFI @ 240 Hz | 25% | 2.8% |
So if you ran the Sony Megatron on its own on a full white screen you’d get 11.25% of the subpixels lit up, and if you have BFI on then it’s just 2.8% of them turned on over a four-frame cycle. Hence 2-5% Peak Luminance is the metric to look at for OLEDs.
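If you want to reproduce those numbers, the arithmetic is just a chain of multiplications. A minimal sketch (the stage values come straight from the table above; the function name is mine):

```python
# Cumulative luminous efficiency of the Megatron pipeline on an OLED.
# Each stage multiplies down the fraction of light the panel emits.
STAGES = [
    ("RGBX mask (3 of 12 subpixels lit)", 3 / 12),    # 25%
    ("Bezier scanlines (45% average level)", 0.45),   # 45%
    ("Strobing/BFI @ 240 Hz (1 frame in 4)", 1 / 4),  # 25%
]

def cumulative_efficiency(stages):
    """Multiply per-stage efficiencies into a running total."""
    total = 1.0
    for name, eff in stages:
        total *= eff
        print(f"{name}: {eff:.2%} -> cumulative {total:.2%}")
    return total

cumulative_efficiency(STAGES)  # ends at ~2.81%, the 2.8% quoted above
```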
For MiniLED LCDs, though, it’s a different story: they use dimming zones, i.e. groups of LEDs lighting up sections of the LCD panel, and it’s these LEDs that draw the power and generate the heat. As such they can’t take advantage of turned-off subpixels, so a full white screen with the Sony Megatron uses the same backlight power as a full white screen without the Megatron masks and scanlines. Hence backlit-LCD users need to look at the 50-100% Sustained Luminance figures.
Luckily, it just so happens that LCDs are much better at sustained luminance than OLEDs, so in a lot of cases LCDs at 50-100% sustained are still brighter than OLEDs at 2-5% peak. But the Sony Megatron certainly narrows the brightness gap between the two technologies drastically.
Just out of curiosity, I plugged the above figures in to see what would be required to simulate various CRT displays on an OLED/LCD with backlight strobing/BFI turned on for motion clarity, and we’re looking at these kinds of figures (it’s worth noting that human perception of brightness scales roughly logarithmically with nits, i.e. ever-larger nit increases are required for ever-smaller perceptual brightness gains; think of the gamma 2.2 curve):
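The figures themselves were posted as an image, but the underlying sum is easy to sketch. A minimal example, assuming a CRT peak of roughly 100 nits as the target (that target is my assumption, a common ballpark, not a figure from the post):

```python
# Estimate the panel peak brightness needed so that, after the Megatron
# mask/scanline/BFI losses, the image still matches a CRT's brightness.
CRT_TARGET_NITS = 100.0  # assumed CRT peak, a common ballpark

EFFICIENCY = {
    "OLED, mask + scanlines": 0.1125,          # 11.25% from the table
    "OLED, mask + scanlines + BFI": 0.028125,  # ~2.8% from the table
}

for setup, eff in EFFICIENCY.items():
    print(f"{setup}: needs ~{CRT_TARGET_NITS / eff:.0f} nits peak")
# mask + scanlines: ~889 nits; with BFI as well: ~3556 nits
```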
As for the Vulkan AMD driver issue, it’s something to do with the drivers messing up the output. The Sony Megatron is pretty simple: it just turns off subpixels. If the driver does some kind of reconstruction/compression/fancy new AI feature it’s going to wreck the Sony Megatron and, to an extent, all other output. Sadly this is one for AMD, as I can’t do anything about it. Maybe post on the Phoronix forums; they do a lot of work with AMD Vulkan drivers, so see if they know whether it’s a known issue.
This is an absolutely massive night and day upgrade. Well done and thank you for your efforts. I eagerly await digging into a version of Megatron that includes these updates.
In case anyone is wondering, with this update:
Setting both Peak Luminance and Paper White to 100 cd/m² matches the SDR baseline.
Colour Boost “Off” uses Rec709/sRGB primaries. “On” stretches the gamut to Expanded709.
@MajorPainTheCactus Could you consider adding additional options to Colour Boost for P3-D65 and Rec2020 primaries? This would allow non-HDR aware shaders with WCG options to work correctly.
(Rec601 and PAL/EBU would be nice as well, but admittedly wouldn’t really fall under the definition of “boost” xD)
What’s the impact of this for HDR users who run RetroArch in Gamescope? Steam Deck and SteamOS are growing in popularity, and with the advent of the Steam Machine we may see even more users running RetroArch (or solutions like Retrodeck/Emudeck). Most (including myself) will be running SteamOS or a fork like Bazzite. Will HDR still work after this change to the KMS/DRM layer?
Really excited to see HDR updates, but unfortunately this completely breaks HDR in RetroArch on my setup: Windows 10 22H2, a 5090 on driver 591.74. Vulkan and DX11 show the same super-crushed, high-gain look, and on DX12 it doesn’t open at all. The HDR settings don’t change anything.
The previous implementation worked fine, except for the lack of a separate “UI Luminance” setting, which made the high-luminance setups you need for BFI or the CRT beam simulator extremely annoying to use, thanks to the ultra-bright UI you’d get as a result.
Great stuff. Yes, I might add those features depending on the direction people want to go, but then again this is a built-in scanline simulation specifically tailored to casual users who don’t venture into shader land; I don’t want to overload them with options.
What I’ll probably do instead is update the Sony Megatron (I basically already have) in the shader repo (I’ll rename the current one to legacy so people can keep all their old presets working as-is, with just a repointing to the old shader). That should hopefully meet all the demands of a power user whilst keeping the HDR menu relatively straightforward for casual users (OK, subpixel layouts and luminance levels aren’t the best).
I have no idea what Gamescope is; is that related to the Steam Deck/Machine? As for compatibility, hopefully we won’t stop anything that was working from working in the future; there isn’t any need to.
Righty ho, so this is an issue, as I’m running Windows 11 on both my main desktop machines. I have a laptop with Windows 10, I think, so I’ll go away now and test it on that. In the meantime, can you tell me the exact location you got your build from? Also, if it was installed over the top of a previous version, could you try installing to a fresh directory? Can you tell me your display setup as well: what monitor are you using?
As for the UI luminance issue, really we need a brightness setting in the UI, or ideally for it to just draw at paper white when HDR is enabled. It currently renders itself over the top of the end result of the HDR conversion (as it should), and I don’t have any control over it. I can understand it would get eye-searingly bright on a 10,000-nit TV though.
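For context, “drawing at paper white” would mean PQ-encoding the UI against the chosen paper-white level rather than letting it ride up to panel peak. A minimal sketch of the ST 2084 (PQ) encoding involved (the constants are the published ST 2084 ones; the function and variable names are mine, not RetroArch’s):

```python
# ST 2084 (PQ) inverse EOTF: absolute nits -> signal value in [0, 1].
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalise to PQ's 10,000-nit range
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# UI white drawn at paper white vs. at panel peak:
print(f"200 nit paper white  -> PQ {pq_encode(200):.3f}")   # ~0.58
print(f"1,000 nit panel peak -> PQ {pq_encode(1000):.3f}")  # ~0.75
```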
and tell me what it does? Using an HDR-native shader like that should behave differently, as it overrides all the settings in the menu and does its own HDR10 conversion and inverse tonemapping, etc.
Actually, if you have the Sony Megatron running you won’t see anything change, as it’s a native HDR shader, i.e. it does all the HDR10 conversion and inverse tonemapping itself; you have to turn it off to see my changes. I will update the Sony Megatron to use the menu options and the same HDR10 conversion and inverse tonemapper shortly; I just need RetroArch to be released before I do that, as otherwise it will just break the shader for end users.
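For anyone unfamiliar with the term, “inverse tonemapping” here just means expanding an SDR signal up into an HDR luminance range. A deliberately simplified illustration of the idea; this is a generic sketch, not the Megatron shader’s actual implementation:

```python
# Generic SDR -> HDR expansion sketch: decode the SDR transfer curve to
# linear light, then place SDR reference white at the chosen paper-white
# level. Real inverse tonemappers also expand highlights above paper
# white towards peak luminance; this is illustration only.
PAPER_WHITE_NITS = 200.0  # what SDR 1.0 should map to in HDR

def inverse_tonemap(sdr: float, gamma: float = 2.2) -> float:
    linear = sdr ** gamma             # decode the SDR gamma curve
    return linear * PAPER_WHITE_NITS  # express in absolute nits

print(inverse_tonemap(1.0))  # 200.0 nits for SDR white
print(inverse_tonemap(0.5))  # ~43.5 nits
```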
So if that is the case, we’re left with just the DX12 failure. Can you post any logs for that build?
Gamescope is a microcompositor that Valve has developed for games. My understanding is that basically it is a standalone Wayland instance which can be manipulated to run custom resolutions, HDR, VRR and so on.
It essentially sandboxes whichever application you run in it for efficiency and performance.
It’s great for Steam’s use case (mainly running games in Proton) but I can see it having great advantages for HTPC and emulation too.
Because it sort of ignores desktop settings (I think) it can behave oddly or unexpectedly.
I’ll try testing it as soon as the PR is created/merged, and report the results. For some reason, the last time I tried, the HDR option didn’t appear in the Linux AppImage build, only in the Flatpak build, which I couldn’t compile; but maybe that changed with the PR to update the RetroArch HDR system being merged.
You’re most welcome @MajorPainTheCactus. It has been a labour of love. I can’t begin to express how grateful I am to you and all other developers who have been creating these wonderful tools which bring so much joy to so many lives.
One request though: is it possible to have some more granularity with respect to the slot mask height? For example, I started using some of the 4K Slot Mask options but I found the phosphors to be a little too tall. When I switched to 8K, I found the height to be much more ideal in that situation. The problem with the 8K masks was that they weren’t pure RGB like the 4K ones.
So my next step was to transfer the 4K masks over to the 8K section in order to keep the pure RGB effect with the shorter slot mask height that I preferred.
Also, I’m not sure if this is helpful or not, but I recently read a post by @Nesguy stating that CRT-Guest-Advanced uses XRRGGBB masks instead of RRGGBBX.
Additional Colour Boost options would be desirable when using HDR without any scanlines or CRT simulation whatsoever.
For example, the gba-color.slangp shader has a parameter for WCG displays (“Color Profile (1=sRGB, 2=DCI, 3=Rec2020)”) and looks best using it, but that parameter is currently incompatible with the way RetroArch does HDR, forcing you to leave it set to sRGB/709 mode (matching Colour Boost “Off”) and giving inferior results to those possible with WCG SDR.
Adding additional options to Colour Boost for “DCI” (P3-D65) and Rec2020 would allow such shader parameters to operate as intended in HDR.
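For anyone wondering what such an option involves under the hood, it’s essentially one 3×3 matrix in linear light. A minimal sketch using the standard BT.709 to BT.2020 conversion (the coefficients are the published ones; the function name is mine):

```python
# Convert linear-light Rec.709 RGB into a Rec.2020 container, as used by
# HDR10. Skipping this matrix and treating 709 values as 2020 values is
# what visibly "stretches" colours across the wider gamut.
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    return [sum(m * c for m, c in zip(row, rgb)) for row in BT709_TO_BT2020]

# Pure 709 red, correctly placed inside the 2020 container:
print(rec709_to_rec2020([1.0, 0.0, 0.0]))  # [0.6274, 0.0691, 0.0164]
```

A P3-D65 option would be the same idea with the corresponding P3 matrix instead.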
Alternatively, I suppose you could revisit your old “hdr10.slang”/“inverse_tonemap.slang” idea and create a shader called “RetroArch Advanced HDR Options” or something like that, which could override the UI settings and contain additional parameters for advanced users, like these additional colour gamut settings and decoding gamma.
Ah interesting, what was the reasoning behind XRRGGBB masks instead of RRGGBBX? I am going to revisit the slot masks because 5K monitors can probably do them properly with their 12-pixel-high scanlines; the bar across will be proportional to the height of the scanline at that resolution, about 8.333% (1/12) ish lol.
The problem, if I’ve understood you correctly, is that unless you know about these things you won’t have the faintest idea which option to select. Casual users in the main RetroArch menu can understand “Colour Boost ON/OFF”, but I doubt they would understand “Colour Boost DCI/Rec2020/sRGB” etc. Isn’t it better that this stays in the advanced section of the shaders? Don’t get me wrong, it’s a great thing to have, but surely simplicity has to reign, or otherwise you end up with the Sony Megatron’s rat’s nest of options scaring people off? lol
Tbh, I would suggest that by that logic the Colour Boost setting shouldn’t exist at all, for the exact same reason that you removed the Contrast setting. The “correct” Colour Boost setting for someone who doesn’t know what it does is always going to be Off/standard 709 primaries.
I haven’t the slightest idea. Just thought I’d give you some food for thought.
I’ve learned quite a bit about slot masks over the last few months, the most important lesson probably being that they require the most brightness to be emulated well, especially with BFI. Another thing is that they look much better when the horizontal slots near the center are evenly centered between the scanlines, and when there are no horizontal slots too close to the scanlines above or below. For those situations, I have experimented with increasing the size of the scanline gaps until they occlude any horizontal slots near a scanline gap. For the vertical centering and alignment, I use the Vertical Offset shader parameter.
The thing is, this vertical alignment changes at different Integer Scale values.
I do similar things for dot masks, as it sometimes looks weird when the scanline gaps appear to cull dots, leaving less than half a dot, or a quarter, a third or a sliver of a dot at the top and bottom of a scanline. Perhaps some sort of optional clamping or snapping mechanism to prevent “unnatural” clipping of the slot and dot mask phosphors might be a useful feature (see the sketch below).
This may not be a completely accurate feature, as I’ve seen unaligned slot masks and scanlines in real CRT photos, but it doesn’t look as bad there as it does when emulated.
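To make the snapping idea concrete, here’s a rough sketch of one way it could work: try each vertical mask offset and keep the one whose slot rows land furthest from the scanline gaps. Everything here (the helper name, the period model) is hypothetical, not the Megatron’s actual code:

```python
# Hypothetical "snap" helper: pick a vertical mask offset (in pixels)
# that keeps horizontal slot rows away from scanline gaps, so no slot
# gets clipped into an odd sliver. Both periods are in screen pixels.
def snap_mask_offset(scanline_period: int, mask_period: int) -> int:
    best_offset, best_clearance = 0, -1.0
    for offset in range(mask_period):
        # Smallest distance from any slot row to the nearest scanline
        # gap, measured over one full repeat of both patterns.
        clearance = min(
            min((row + offset) % scanline_period,
                scanline_period - (row + offset) % scanline_period)
            for row in range(0, scanline_period * mask_period, mask_period)
        )
        if clearance > best_clearance:
            best_offset, best_clearance = offset, clearance
    return best_offset

# e.g. a 12-pixel scanline period with an 8-pixel slot repeat:
print(snap_mask_offset(12, 8))  # 2
```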
These are exactly the kinds of discussions and comparisons we’ve been having here:
This is the taller slot mask, with the slot mask and scanlines aligned and the scanline gaps adjusted to cull any horizontal slots appearing too close to the scanline gaps:
This is the consensus @Nesguy and I have reached when it comes to Slot Masks with BFI and brightness:
8K Masks Modified to 4K Mask Layouts:
In this example the slot mask is not as tall and overall the alignment is not as strict. Similarly, for using scanline gaps to cull the horizontal slots appearing close to the scanlines: it doesn’t look so bad where the scanline gaps are larger in the darker areas, but it looks a bit strange on Tyris’ forearm, where you can see slot mask slots appearing close to the scanline. I think on a real CRT there would be a more gradual fade/drop-off to black, which might make things like that less jarring/noticeable.
So, in other words, what I’m trying to say is that when horizontal slot mask slots and dot mask dots appear close to thin or similarly sized scanline gaps, it produces strange moiré patterns.
Proposal: add a 1440p section under Display’s Resolution, and either replace the 8K section with additional 4K masks of similar height to the existing 8K masks, or revamp the Mask/TVL selection to show the actual mask layouts, giving more advanced users a little more control and transparency.
I think 1440p users might be feeling a bit lost or left out given there isn’t a 1440p choice in the Display’s Resolution section, while 1440p is one of the most common PC display resolutions, does a great job with many mask layouts and TVLs, and some of the highest-refresh-rate monitors are of the 1440p variety.
You have to check out the Shader Stack I use in my latest “CyberLab Megatron miniLED Epic Death To Pixels 4K HDR Shader Preset Pack 31-12-25”.
It adds quite a few quality of life features via additional shaders:
It has the awesome CRT-Guest-Advanced-NTSC section with features like Afterglow, RF Noise and Font Preservation thanks to the amazing work of @guest.r !
It uses a modified IMG-Mod shader by @HunterK, which adds built-in support for proper overscan mask cropping and automatic recentering, along with film grain and rounded corners. I recently replaced the film grain version with one that doesn’t include film grain, in favour of CRT-Guest-Advanced’s RF Noise.
It also includes the full Grade, which was needed for things like the Sega Genesis/Mega Drive Luma Fix and Palette, as well as the SMS Blue Fix.
Last but not least, it integrates XBR-LVL-2 for subtle smoothing, which helps bridge the gap between analog CRT displays, with their “natural” anti-aliasing, and modern digital displays, which can sometimes be either a bit too sharp and aliased or a bit blurry.
The only optional features I might have liked to see that are missing are support for reflective bezels and overlays, internal handling of scaling and aspect ratios, and maybe CRT-Beam-Simulator support, though I have never gotten that last one to work properly on my display, so I stick with my TV’s built-in BFI/strobing.
These are examples of what the stack is capable of in the right hands:
Most recently I’ve been focusing on making things brighter for “normal” folks.