I am now quite certain that, with the current Colour System/Phosphors settings, colors are being clipped and/or clamped and/or compressed to Rec 709 with Original, or to an expanded 709 with Vivid.
It is plain as day when alt-tabbing between multiple instances: the additional gamut options added to the “HDR: Original/Vivid” setting produce colors that much more faithfully match Dogway’s Grade in Rec 2020 mode.
(This also confirmed my expectation that, when testing the Content Color Gamut settings in my testing fork, Mask Accurate/Colour Accurate should be set to Colour Accurate, Colour System to r709, and Phosphors to NONE.)
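To make the suspicion above concrete, here is a minimal sketch of what hard clipping to Rec 709 looks like numerically. This is just an illustration of the effect, not Megatron’s actual code path, and the test colour is made up:

```python
import numpy as np

# Standard linear BT.2020 -> BT.709 conversion matrix.
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def to_709_and_clip(rgb2020_linear):
    """Convert a linear BT.2020 colour to BT.709 and hard-clip to [0, 1]."""
    rgb709 = BT2020_TO_BT709 @ np.asarray(rgb2020_linear, dtype=float)
    return np.clip(rgb709, 0.0, 1.0)

# A saturated green that sits outside BT.709: the conversion goes negative in
# R and B, and the clip throws that away (visible desaturation/hue shift).
print(to_709_and_clip([0.0, 0.8, 0.0]))
```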
Ah yes, that's the solution I proposed yesterday, which was after this post - sorry, catching up! This looks fantastic - I love Daytona - I need to try this out ASAP on my QD-OLED!
Yes, but we're talking about consoles here that produce an image - this gets us back to what I was saying at the start: the signal output by the console is gamma corrected so that the TV can just pass it through, and thus it must have been the console that did the correcting in its TV output circuitry/display chip. Certainly developers weren't gamma correcting graphics/textures on a SNES like they did on an Xbox 360, say. So if that is true, did all the console manufacturers ignore the TV standards? Again, it seems odd they'd do that.
Yes, but it is the emulators/cores that are doing the correcting/encoding in this context, whether they do so 100% accurately for Megatron’s purposes or not.
Megatron is the display.
So, ideally, I would say this is what we want to be happening gamma-wise for CRT-based consoles:
The “console” (emulator/core) uses a hardware accurate encoding gamma, whatever that may be.
The “display” (Megatron) uses a CRT accurate decoding gamma (pure power gamma in the 2.1-2.5 range).
In reality, our “console” might not be using the correct gamma encode function, but we can't fix that by decoding the incorrectly encoded image with the inverse of the correct gamma encode function.
We could add an option to use a piecewise sRGB decoding gamma, I suppose? Since that would presumably be the single most common “incorrect” encoding gamma produced by emulators/cores. As a bonus, this could also be useful for some more modern PC games that look their best with piecewise sRGB.
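To illustrate the encode/decode split I'm describing, here is a minimal sketch, assuming the core encodes with the standard piecewise sRGB curve and the “display” decodes with a pure power law. The functions are the standard IEC 61966-2-1 ones, but whether any given core actually uses them is an assumption:

```python
def srgb_encode(c):
    """Standard piecewise sRGB encode (IEC 61966-2-1)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

def srgb_decode(v):
    """Inverse of the piecewise sRGB curve."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def crt_decode(v, gamma=2.4):
    """Pure power-law decode, like a CRT in the ~2.1-2.5 range."""
    return v ** gamma

# The "console" (emulator/core) encodes; the "display" (Megatron) decodes.
encoded = srgb_encode(0.5)           # what the core hands us, if it uses sRGB
print(crt_decode(encoded, 2.4))      # power-law decode: a deliberate mismatch
print(srgb_decode(encoded))          # exact inverse of the encode: returns 0.5
```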
I see the emulator/core as providing only the raw, unencoded and unprocessed video source, and Megatron (or any other CRT shader) as having the responsibility of handling both the video output encoding stages and the “display” functionality of simulating various CRT TVs.
In fact, this is what all those NTSC filters/shaders are about.
If this weren't the case and Megatron were just the “display”, with everything else done at the emulator/core level, then Sega Genesis games would have blending and transparency just by “hooking up” the emulator/core to the “display”.
However, this is not the case unless we add TV system filters to the output stage of the core or as shaders.
If what you say is correct, then you should be able to use one preset with any emulator and have acceptable gamma/saturation.
In practice, if you take a preset that’s highly tuned for SNES Core output and you use it on an NES Core, it’s going to look very oversaturated.
Similarly, if you take that same SNES tuned preset and try to use it to play CPS1 games, it’s going to look a bit too bright and washed out.
So the shader has a role to play from even before the video signal leaves the box, all the way through the various TV inputs and finally for display on those virtual phosphors.
Keep in mind that we are talking about gamma specifically in this case. Emulators/cores don't necessarily simulate NTSC signal degradation, but in most cases, at least, they absolutely aren't producing raw output with no gamma encoding whatsoever.
If they were, simply hooking a PC up to a CRT wouldn’t produce such good results.
I just measured the primary colors with my i1 Pro 2 spectrophotometer and there are pretty big differences between Windows 11 Auto HDR and the built in HDR in the Megatron shader within Retroarch.
To compare, first I used W11 Auto HDR in conjunction with the Megatron ReShade shader (Neutral preset) with color system set to r709 in the shader menu. My TV is set to color temperature Warm 1, which is still a lot cooler than 6500K when used without a shader (Warm 2 also measures slightly above 7000K).
With the RetroArch Megatron shader (default HDR preset) using its own HDR tonemapping, color system also set to r709, and the same TV color temp, I get this:
This is exactly what I saw with my eyes before measuring: there is too much yellow in the green with the Megatron HDR tonemapping.
I also noticed that the white point is too far off the black body curve, towards magenta. The white point with Windows 11 Auto HDR is much closer to the black body curve and should also be more accurate, as my TV is set to Warm 1 and should be close to 9000K.
The white point with W11 Auto HDR measures around 9800K and with Megatron HDR around 6600K. I think that 6600K is too warm for my LG OLED with color temperature set to Warm 1.
Sure, Windows 11 Auto HDR oversaturates colors further beyond Rec 709, which is also not accurate, but in summary, to me the tonemapping just looks better and more coherent, with a purer green that doesn't shift towards yellow, and with a white point that's much closer to the color temperature set by the TV (when measured without the shader) and much closer to the black body curve, without the magenta shift that otherwise gives the picture a pinkish tint, which I noticed right from the beginning when using the Megatron shader with its own HDR TM.
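For reference, the colour temperature numbers I'm quoting come from the measured white point's xy chromaticity; a common approximation for turning that into a correlated colour temperature is McCamy's formula, sketched below (the second test point is just a made-up bluish white, not one of my actual measurements):

```python
def mccamy_cct(x, y):
    """Approximate correlated colour temperature from CIE 1931 xy (McCamy)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(mccamy_cct(0.3127, 0.3290))  # D65 white point, comes out near 6500K
print(mccamy_cct(0.2870, 0.2960))  # an example bluish white, roughly 9000K
```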
I would love to see MajorPainTheCactus also compare Windows 11 Auto HDR with the Megatron HDR tonemapping, if time allows, and give us some feedback on what he thinks about it.
I also did not forget to send you some close-up screenshots of the RGB Aperture Grille mask, so you can see that my WRGB OLED works correctly with the RGB subpixel layout, which you may have wondered about.
I noticed that the ReShade port of Megatron and the RetroArch Megatron shader gave me different results for the subpixel structure, which I have to investigate further.
I will post some comparisons between the two with the same settings later.
What I’m more interested in is how. What settings did you use to get it to look like that? What you have demonstrated is nothing short of a milestone and it’s something that I’ve been searching for for a very long time here.
At one point it was believed that the LG OLED TV subpixel structure was useless for proper CRT Mask emulation, so B&W Masks were recommended and widely included in Shaders.
I researched and worked on it, and stumbled across by accident the fact that the Guest Shaders actually provide evenly structured CRT Masks/Phosphors if the BGR mask layout is used. Guest's implementation actually seems to be RBG even though it's called BGR.
I agitated and advocated for something similar to be included in Sony Megatron Colour Video Monitor. I even modified the code myself to prove my case. Now we have the RWBG Mask Layout for LG OLED TVs available.
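Conceptually, a mask layout is just a rule for which phosphor colour each output pixel column gets, so that the lit physical subpixels form clean triads. Here is a rough sketch of that idea - the layout strings and the choice to leave the white subpixel's column dark on WRGB are my assumptions for illustration, not Megatron's actual implementation:

```python
# Phosphor colour assigned to each letter of a mask layout string.
MASK_COLOURS = {
    "R": (1.0, 0.0, 0.0),
    "G": (0.0, 1.0, 0.0),
    "B": (0.0, 0.0, 1.0),
    "W": (0.0, 0.0, 0.0),  # assumed: keep the white subpixel's column dark
}

def mask_for_column(layout: str, x: int):
    """Phosphor colour used at output column x for the given layout."""
    return MASK_COLOURS[layout[x % len(layout)]]

for layout in ("RGB", "BGR", "RWBG"):
    print(layout, [mask_for_column(layout, x) for x in range(4)])
```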
You've shown a sort of proof of concept that it really is possible to get good enough looking RGB phosphors out of LG OLED TVs, so I'm really excited to have that accomplishment followed up on and made available to the wider community, and by extension myself, of course.
So to me, this is like the biggest thing since sliced bread.
Which one gave the proper looking RGB Phosphors, Megatron for Reshade or Megatron for RetroArch?
I also agree with your suspicions regarding colour and tonemapping. When I first started tweaking, I remember thinking that the shade of green in Super Mario World was a bit more “limey” than I remembered, and initially I found that some of my reds looked a little more rusty.
Colours do look slightly better in the Colour Accurate mode, though, but that turns on extra (for me, unsightly) subpixels, which totally kills the illusion of having RGB Phosphor Triads like a CRT. Maybe with your research, input and some tweaks, the accuracy gap can be further mitigated.
I’m looking forward to this.
@MajorPainTheCactus I understand that you have some serious responsibilities IRL and challenges when it comes to spare time to work on these things. I just want to say that I hope you don't feel pressured, overwhelmed or obligated with respect to maintaining and improving this wonderful gift you have given to the world.
As far as I'm concerned, you can take your time, and as long as it takes, to get to whatever it is you see us talking about here.
I wish you and your family all the best in all of your endeavours.
@Dennis1 this topic is so important to me that I have an entire thread dedicated to it here:
This RetroArch Megatron CIE graph was taken with Mask Accurate and “Display’s Subpixel Layout” set to RGB, correct?
If so, please check Colour Accurate with RWBG (OLED) as well.
I would also check RetroArch’s internal HDR settings without shaders active (use the same luminance and paper white settings you are using for Megatron with Expand Gamut turned off).
Wait, make sure Original/Vivid is set to Original in RetroArch Megatron. Now that I'm taking a second look at your graph, I think it might just be this that's causing the color skew.
Original is Rec 709; Vivid is an “expanded” 709 that pulls the green and red primaries towards P3, as I showed in this post.
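For reference, here is roughly what “pulling the primaries towards P3” means in terms of chromaticities. The Rec 709 and Display P3 coordinates are the standard ones, but the interpolation amount is just an assumed knob for illustration, not the actual Vivid math:

```python
# Standard CIE xy chromaticities of the Rec 709 and Display P3 primaries.
REC709 = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
P3     = {"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)}

def expanded_primaries(t):
    """Interpolate the Rec 709 primaries toward P3 by an assumed factor t."""
    return {k: tuple((1 - t) * a + t * b for a, b in zip(REC709[k], P3[k]))
            for k in REC709}

print(expanded_primaries(0.0))  # plain Rec 709 ("Original")
print(expanded_primaries(0.5))  # red/green pulled halfway toward P3 (Vivid-like)
```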
As you can see, there are no issues with the Aperture Grille and RGB subpixel layout.
To my surprise, I discovered that the Virtua Fighter preset has a very nice slot mask, which I was not getting with the Megatron Default Preset.
Despite the blue subpixels slightly overlapping the green subpixels, resulting in a very fine cyan-colored line, this looks great to me! I thought I could only use the Aperture Grille with the Megatron shader due to the weird-looking Slot Mask in the Default Preset, but this changed my mind. You just have to compare all the presets and see what works best here. I read before that other users also had issues with the weird-looking Slot Mask, so I would suggest trying a different preset such as the Virtua Fighter one.
Also, the maximum luminance and paper white values are very different in the Virtua Fighter preset compared to the Megatron Default preset.
In the VF preset I can't go over 300 nits paper white, as this results in overblown highlights and clipping, whereas with the MT default preset I can go all the way up to 800 nits paper white while still looking good.
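My rough mental model of why too much paper white clips is sketched below. It's a simplification, and the numbers are made up - presets clearly differ in how much headroom their mask/scanline compensation needs above paper white:

```python
def to_display_nits(sdr_linear, paper_white_nits, peak_nits):
    """Map a linear SDR value (1.0 = reference white) to nits, clipping at peak."""
    return min(sdr_linear * paper_white_nits, peak_nits)

# A highlight at 2x reference white on a panel/tonemapper topping out at ~700 nits:
print(to_display_nits(2.0, 300.0, 700.0))  # 600 nits, still within range
print(to_display_nits(2.0, 800.0, 700.0))  # clipped at 700 nits, detail lost
```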
The bottom line is that the Megatron shader is very versatile and the presets all behave very differently. Also, while ReShade has a Megatron preset called “Neutral”, which has the saturation slider at a value of 0, the RetroArch Megatron shader has no Neutral preset, only the preset called Megatron Default, which I thought would be the same.
But when I go into the settings there, the saturation value is set at 0.45, which oversaturates colors.
So the ReShade Neutral preset and the RetroArch Default preset are not the same. These are some of the differences between the ReShade and RetroArch ports, and I think I would discover even more if I dug deeper.
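For what it's worth, a typical saturation slider of this kind works by pushing each colour away from its luma, so a value of 0 leaves the image untouched and 0.45 visibly boosts saturation. This is a guess at the general mechanism, not Megatron's actual formula:

```python
def adjust_saturation(r, g, b, s):
    """Push a colour away from its Rec 709 luma by s (0 = unchanged)."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(max(0.0, min(1.0, luma + (c - luma) * (1.0 + s)))
                 for c in (r, g, b))

print(adjust_saturation(0.6, 0.3, 0.2, 0.0))   # unchanged, "Neutral"-like
print(adjust_saturation(0.6, 0.3, 0.2, 0.45))  # noticeably more saturated
```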
Also I compared Vivid and Original, but they measure almost the same.
I will check later with the RWBG subpixel layout and RA's internal HDR settings without a shader active, as you suggested.