Sony Megatron Colour Video Monitor

No, this isn't what was fixed - we just changed the input gamma curve from a standards-based one to a straight power curve. No tonemapping occurs in this shader, only inverse tonemapping when HDR is used, much further down the shader pipeline than at this point.
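
To illustrate (a simplified sketch in C, not the shader's actual source), the two kinds of input gamma curve look like this:

#include <math.h>

/* Standards-based decode: the piecewise sRGB EOTF, with a linear
 * segment near black and a 2.4 power segment above it. */
static float srgb_decode(float v)
{
    return (v <= 0.04045f) ? v / 12.92f
                           : powf((v + 0.055f) / 1.055f, 2.4f);
}

/* Straight power curve: what the shader uses now, with gamma
 * typically somewhere in the 2.2-2.4 range. */
static float power_decode(float v, float gamma)
{
    return powf(v, gamma);
}

The difference is concentrated near black, where the piecewise curve's linear segment lifts the darks relative to a straight power curve.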

1 Like

Yes, although not ideal, I can't guarantee backwards compatibility. If things need to be fixed then they need to be fixed; however, in this case it is quite simple to go back. We should try and get to the bottom of what is correct in terms of original hardware - this point has been a bit contentious in the shader for some time, as I've swapped back and forth between the two input gamma curves over time. I've gone with the current option as it does keep the shader's darks closer to the colours used when not using shaders - I'm yet to be completely convinced whether that is correct to original hardware across console, cable and TV. I hear your complaints though.

1 Like

Yes, I think this is just us running into discrete pixels and sharp fall-offs. If we don't have integer scaling, then every x number of lines you're going to have an extra one, and with sharp fall-offs and the slot mask it really highlights this. I certainly started using integer scaling when using certain slot mask presets I've created. Whether I can do much about this I'm not sure, but I'll certainly try to repro this and play around with the various attacks we have.
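
To make the arithmetic concrete (illustrative numbers, nothing from the shader itself): 224 source lines scaled into a 1080-pixel-tall viewport is 1080/224 = ~4.82x, so scanline heights can't all be equal - most source lines cover 5 output pixels, but periodically one covers only 4, and a sharp fall-off plus the slot mask makes that beat pattern visible:

#include <stdio.h>

int main(void)
{
    const int src_lines = 224, out_height = 1080;  /* 1080/224 = ~4.82x */
    int prev = 0;
    for (int i = 1; i <= src_lines; i++) {
        int end = i * out_height / src_lines;      /* floor of i * scale */
        printf("source line %3d -> %d output pixels\n", i, end - prev);
        prev = end;
    }
    return 0;
}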

1 Like

Great stuff, thanks for this! I'll take a look and see what you've done! You're going to have to bear with me though, as I'm currently away and have a ton of work to do when I get back - it's not easy finding time atm.

2 Likes

When I have the time I hope to expand the AutoHDR addon for ReShade. However, is it needed on Windows 11? Can't you turn Auto HDR on in Windows and just use Megatron ReShade with that? I don't know - I don't have Windows 11. Failing that, what are the problems with using RetroArch for those systems?

2 Likes

Just to go a little more in depth with all this until you have time to check things out for yourself: at least based on lilium's HDR analysis tools, the simulated phosphor primaries are located at the Rec 709 primaries with the color system set to 709 and Original/Vivid set to Original:

or at the Expanded 709 primaries when set to Vivid:

Changing the color system or phosphors makes the CIE graph for Original look like this instead, regardless of which system you pick:

Which sort of looks like the shader is attempting to use the 709 primaries to break out of 709?

Alternatively, if we add a new “kNTSCJP22_to_2020” in hdr10.h generated using Dogway’s NTSC-J primaries, we get this instead:

Or, for kNTSC1953_to_2020:

Based on what comparisons I've been able to do so far, I think this is giving much more accurate results. Specifically, I think that RetroArch in SDR with Dogway's Grade and the display both set to 2020 colorimetry looks more similar to the solution I am testing than it does to 5.7.
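
For reference, this is roughly how such a conversion matrix can be generated from chromaticity coordinates (a standalone sketch; the NTSC-J P22 primaries and ~9300K white point below are illustrative placeholders, not Dogway's exact values):

#include <stdio.h>

typedef struct { double x, y; } xy;

/* Invert a 3x3 matrix via the adjugate. */
static void inv3(double a[3][3], double o[3][3])
{
    double d = a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
             - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
             + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]);
    o[0][0] = (a[1][1] * a[2][2] - a[1][2] * a[2][1]) / d;
    o[0][1] = (a[0][2] * a[2][1] - a[0][1] * a[2][2]) / d;
    o[0][2] = (a[0][1] * a[1][2] - a[0][2] * a[1][1]) / d;
    o[1][0] = (a[1][2] * a[2][0] - a[1][0] * a[2][2]) / d;
    o[1][1] = (a[0][0] * a[2][2] - a[0][2] * a[2][0]) / d;
    o[1][2] = (a[0][2] * a[1][0] - a[0][0] * a[1][2]) / d;
    o[2][0] = (a[1][0] * a[2][1] - a[1][1] * a[2][0]) / d;
    o[2][1] = (a[0][1] * a[2][0] - a[0][0] * a[2][1]) / d;
    o[2][2] = (a[0][0] * a[1][1] - a[0][1] * a[1][0]) / d;
}

/* Build an RGB->XYZ matrix from primaries r,g,b and white point w,
 * scaled so that RGB(1,1,1) maps to the white point at Y = 1. */
static void rgb_to_xyz(xy r, xy g, xy b, xy w, double m[3][3])
{
    xy p[3] = { r, g, b };
    double c[3][3], ci[3][3];
    double wv[3] = { w.x / w.y, 1.0, (1.0 - w.x - w.y) / w.y };
    for (int i = 0; i < 3; i++) {       /* column i = XYZ of primary i */
        c[0][i] = p[i].x / p[i].y;
        c[1][i] = 1.0;
        c[2][i] = (1.0 - p[i].x - p[i].y) / p[i].y;
    }
    inv3(c, ci);
    for (int i = 0; i < 3; i++) {       /* per-channel scale hitting white */
        double s = ci[i][0] * wv[0] + ci[i][1] * wv[1] + ci[i][2] * wv[2];
        for (int j = 0; j < 3; j++)
            m[j][i] = c[j][i] * s;
    }
}

int main(void)
{
    /* Placeholder NTSC-J P22 primaries with a ~9300K white point -
     * substitute whatever values you actually want to match. */
    xy r = { 0.625, 0.340 }, g = { 0.280, 0.595 }, b = { 0.152, 0.063 };
    xy w = { 0.2831, 0.2971 };
    /* Rec 2020 primaries, D65 white. */
    xy r2 = { 0.708, 0.292 }, g2 = { 0.170, 0.797 }, b2 = { 0.131, 0.046 };
    xy d65 = { 0.3127, 0.3290 };

    double src[3][3], dst[3][3], di[3][3];
    rgb_to_xyz(r, g, b, w, src);
    rgb_to_xyz(r2, g2, b2, d65, dst);
    inv3(dst, di);

    /* m = inverse(2020->XYZ) * (source->XYZ), printed row by row. */
    for (int i = 0; i < 3; i++) {
        double row[3];
        for (int j = 0; j < 3; j++)
            row[j] = di[i][0] * src[0][j] + di[i][1] * src[1][j]
                   + di[i][2] * src[2][j];
        printf("%+.6f, %+.6f, %+.6f,\n", row[0], row[1], row[2]);
    }
    return 0;
}

The idea is just sourceRGB -> XYZ -> Rec 2020 RGB, with each RGB->XYZ matrix built so that RGB(1,1,1) lands on the chosen white point.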

Unfortunately, I haven't been able to do the sorts of direct comparisons that would really satisfy me yet, as I only have one WCG/HDR display. As it turns out, this issue applies to the Expand Gamut option in RetroArch as well, so I can't currently run a second HDR instance of RetroArch with Dogway's Grade set to Rec 2020 for alt-tab comparisons, let alone do a proper side-by-side comparison.

Expand Gamut Off clamps colors to Rec 709:

and Expand Gamut On clamps to the Expanded709 space:

I have attempted to make my own testing fork of RetroArch with the primaries for the Expand Gamut setting adjusted, but the resulting build still locks to 709/e709 depending on the setting, so I am clearly missing something else in the code. I will keep at it for now and see if I can figure it out on my own, but if not, assistance would be appreciated once you are less busy.

2 Likes

I’ve updated my preset pack taking the new Sony Megatron Colour Video Monitor v5.7 Gamma Curve into consideration. This update includes the first, simplified Decoupled CRT-Guest-Advanced-NTSC module provided by @GPDP1 plus integrated Super-XBR!

1 Like

In some cases I can use Windows 11 AutoHDR, but unfortunately not always. The PS1 emulator DuckStation, the PS2 emulator PCSX2 and the Dreamcast emulator Redream won't work; even with the swapchain fix from Nvidia, which routes OpenGL and Vulkan over the DirectX API, I had no chance.

TeknoParrot and the Sega Supermodel emulator (both OpenGL) work fine with Windows 11 AutoHDR when the swapchain fix is used, and they look really good in conjunction with your Megatron shader.

The colors of Windows 11 AutoHDR look a bit better to my eyes when compared to your ReShade AutoHDR plugin or RetroArch. The red primary, for example, looks a tiny bit more pure red and vibrant (not oversaturated), and the green just more pure green, while your AutoHDR plugin mixes in a tiny bit more yellow here. I am really talking about nuances, but it's definitely there, and it doesn't have anything to do with the HDR luminance values, which can mute and distort the hues of colors if set too low.

When I use the Windows 11 HDR Calibration app, I get exactly 800 nits peak brightness and 800 nits paper white as a result. That is very close to the settings I used before (700/600) in the Megatron shader menu within RetroArch and in ReShade with your AutoHDR plugin, as I wrote here before I discovered the W11 AutoHDR feature.

By the way, during the W11 HDR calibration I set my LG GX OLED to HGIG in the TV's HDR tonemapping menu. The TV's dynamic tonemapping should not be used during HDR calibration, as the peak and paper white values will be false.

I also have an i1 Pro spectrophotometer, and as soon as I am home I will measure the R, G and B primaries and post the results here for W11 AutoHDR, your AutoHDR ReShade plugin and the embedded HDR within RetroArch. I am excited to see if my eyes are lying to me about which colors are closer to the BT.709 standard, or let's say more accurate in hue - even if a bit oversaturated outside the BT.709 coordinates :slightly_smiling_face:

If Windows 11 AutoHDR worked with all emulators, I would not mind using it and just sticking with your ReShade Megatron shader, as this combination is perfect to my eyes with the aperture grille mask. But it's not that easy, and your AutoHDR plugin would really help here if it worked with OpenGL and Vulkan too. I hope that it will not be too difficult for you to implement support, and yes, time is always an issue too :pensive:

2 Likes

Good job on analyzing the gamma and color space conversions.

With regards to making changes to hdr.frag on the Vulkan side: before compiling RA, the Vulkan shaders first need to be precompiled to SPIR-V and embedded as .inc files.

So for your changes in hdr.frag to take effect, you need to do something like

glslc.exe hdr.frag -mfmt=c -o hdr.frag.inc

and have the updated hdr.frag.inc in your source tree (same folder as hdr.frag) before compiling RA.

Maybe this helps.

3 Likes

That fixed it! Ty for the assistance!

@MajorPainTheCactus

I am now quite certain that, with the current Colour System/Phosphors settings, colors are being clipped and/or clamped and/or compressed to Rec 709 with Original, or to Expanded709 with Vivid.

It is plain as day when alt-tabbing between multiple instances that adding more gamut options to the “HDR: Original/Vivid” setting results in colors that much more faithfully match Dogway's Grade in Rec 2020 mode.

(This also confirmed my expectation that, when testing the Content Color Gamut settings in my testing fork, Mask Accurate/Colour Accurate should be set to Colour Accurate, Colour System to r709, and Phosphors to NONE.)

3 Likes

(Yes, I'm still catching up - sorry everyone!) I presume this is an 8K TV? Looks great - I'd really like to see this on an 8K.

1 Like

Ah yes, that's the solution I proposed yesterday, which was after this post - sorry, catching up! This looks fantastic - I love Daytona - I need to try this out ASAP on my QD-OLED!

1 Like

Ah, I didn't know about OpenGL on D3D11 or Vulkan on D3D11 - fantastic tip! Thanks!

1 Like

This is actually a GX, so only 4K.

I’m just as perplexed as you @MajorPainTheCactus. :astonished:

Any takes on how @Dennis1 can seemingly get excellent RGB phosphor layout quality with a regular RWBG OLED subpixel layout panel, @MajorPainTheCactus?

1 Like

Yes, but we're talking about consoles here that produce an image - this gets us back to what I was saying at the start: the signal output by the console is gamma corrected so that the TV can just pass it through, and thus it must have been the console that did the correcting in its TV output circuitry/display chip. Certainly developers weren't gamma correcting graphics/textures on a SNES like they did on, say, an Xbox 360. So if that is true, did all the console manufacturers ignore the TV standards? Again, it seems odd they'd do that.

Yes, but it is the emulators/cores that are doing the correcting/encoding in this context, whether they do so 100% accurately for Megatron’s purposes or not.

Megatron is the display.

So, ideally, I would say this is what we want to happen gamma-wise for CRT-based consoles:

  • The “console” (emulator/core) uses a hardware-accurate encoding gamma, whatever that may be.
  • The “display” (Megatron) uses a CRT-accurate decoding gamma (a pure power gamma in the 2.1-2.5 range).

In reality, our “console” might not be using the correct gamma encode function, and we can't fix that by decoding the incorrectly encoded image with the inverse of the correct gamma encode function.

We could add an option to use a piecewise sRGB decoding gamma, I suppose, since that would presumably be the single most common “incorrect” encoding gamma produced by emulators/cores. As a bonus, this could also be useful for some more modern PC games that look their best with piecewise sRGB.
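
To put a number on the mismatch: for an input value of 0.1, the piecewise sRGB decode yields roughly 0.0100 linear, while a pure 2.2 power decode yields roughly 0.0063, so the power curve renders that shadow tone about a third darker. The two curves converge towards white, which is why the choice mostly shows up in the darks.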

2 Likes

I’m not sure I agree with your analogy 100%.

I see the emulator/core as providing only the raw, unencoded and unprocessed video source, and Megatron (or any other CRT shader) as having the responsibility of handling both the video output encoding stages and the “display” functionality of simulating various CRT TVs.

In fact, this is what all those NTSC filters/shaders are about.

If this weren't the case, and Megatron were just the “display” with everything else done at the emulator/core level, then Sega Genesis games would have blending and transparency just by “hooking up” the emulator/core to the “display”.

However, this is not the case unless we add TV-system filters to the output stage of the core, or as shaders.

If what you say were correct, you should be able to use one preset with any emulator and get acceptable gamma/saturation.

In practice, if you take a preset that's highly tuned for SNES core output and use it on an NES core, it's going to look very oversaturated.

Similarly, if you take that same SNES-tuned preset and try to use it to play CPS1 games, it's going to look a bit too bright and washed out.

So the shader has a role to play from even before the video signal leaves the box, all the way through the various TV inputs, and finally in the display on those virtual phosphors.

1 Like

Keep in mind that we are talking specifically about gamma in this case. Emulators/cores don't necessarily simulate NTSC signal degradation, but in at least most cases they absolutely aren't producing raw output with no gamma encoding whatsoever.

If they were, simply hooking a PC up to a CRT wouldn’t produce such good results.

2 Likes

CyberLab Megatron 4K HDR Game SNES Composite Smooth.slangp

1 Like

I just measured the primary colors with my i1 Pro 2 spectrophotometer, and there are pretty big differences between Windows 11 Auto HDR and the built-in HDR in the Megatron shader within RetroArch.

To compare, I first used W11 Auto HDR in conjunction with the Megatron ReShade shader (neutral preset) with the color system set to r709 in the shader menu. My TV is set to color temperature Warm1, which still measures a lot cooler (bluer) than 6500K when used without a shader (even Warm2 measures slightly above 7000K).

And this is what I measure with W11 AutoHDR:

With the RetroArch Megatron shader (default HDR preset) and its own HDR tonemapping, again with the color system set to r709 and the same TV color temperature, I get this:

This is exactly what I saw with my eyes before measuring: there is too much yellow in the green with the Megatron HDR tonemapping.

I also noticed that the white point is too far off the black body curve towards magenta. The white point with Windows 11 Auto HDR is much closer to the black body curve and should also be more accurate, as my TV is set to Warm1 and should be close to 9000K.

The white point with W11 Auto HDR measures around 9800K, and with Megatron HDR around 6600K. I think 6600K is too warm for my LG OLED with the color temperature set to Warm1.

Sure, Windows 11 AutoHDR oversaturates colors further beyond Rec 709, which is also not accurate. But in summary, to me its tonemapping just looks better and more coherent: a purer green without the shift towards yellow, and a white point much closer to the color temperature set by the TV (when measured without the shader) and much closer to the black body curve, without the magenta shift that otherwise gives the picture a pinkish tint - something I noticed right from the beginning when using the Megatron shader with its own HDR tonemapping.

I would love for MajorPainTheCactus to also compare Windows 11 Auto HDR with the Megatron HDR tonemapping, if time allows, and give us some feedback on what he thinks about it :smiling_face_with_three_hearts: :grin:

2 Likes