Ok, my misunderstanding, got it. This might solve some issues I had, for example with the Flycast core, which scales differently than the standalone emulator and produced weird-looking Megatron masks.
And this is necessary to get certain shaders to play well with others that expect core resolution but may have been subject to some upscaling/resampling, for example when trying to integrate or prepend Super-XBR.
Interestingly, I did this earlier today just by observation and trial and error, but it would have been much easier if RetroArch shipped an included preset with the scaling set up this way, especially so new users can access it.
Color gamuts I am currently planning to include (or considering including) in the next update to my gamut fix testing fork, along with justification:
Gamut list
NTSC 1953: The OG color system that was only really used for 5-8ish years, back when basically no one owned a color TV anyway. If you are Brazilian or from a SECAM region, it may also match some old CRT TVs you’ve used with really weirdly intense greens? Hard to say. This sort of thing is kind of underdocumented.
RCA 1958 (?1961?): Millennial’s grandparents’ old TV with weird colors #1.
RCA 1964: Millennial’s grandparents’ old TV with weird colors #2.
SMPTE C/Rec 601-525 line/Conrac: Baseline standard gamut for Analog NTSC.
PAL/Rec 601-625 line: Baseline standard gamut for Analog PAL.
Dogway’s NTSC-J: Baseline standard gamut for Analog NTSC-J.
P22_80s: Dogway’s Grade gamut for 1980s-early 1990s TVs.
Apple RGB/Trinitron PC: Should approximate basically any Trinitron monitor from 1987 to the mid-to-late 1990s. (By the early 00s they were SMPTE C instead, at least for high end monitors like the FW900.)
P22_90s: Dogway’s Grade gamut for mid 1990s TVs with tinted phosphors.
RPTV_95s: Dogway’s Grade gamut for late 90s/early 00s rear projection TVs that game manuals said you shouldn’t play games on due to burn in risk.
Rec 709/sRGB: SDR HDTV/Windows gamut. (This is currently the Original setting as of 5.7.)
Expanded 709: Non-standard gamut someone at ?Microsoft? cooked up because they thought Rec 709 color looked desaturated when converted into HDR. (This is currently the Vivid setting as of 5.7.)
Display P3/P3-D65: Common wide color gamut; a variant of the gamut used for film, with shared primaries. Might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader?
Rec 2020: HDR gamut. Again, might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader.
and under consideration:
ColorMatch RGB: Matches Radius’ PressView calibrated pro displays released starting in 1994. I guess this could be the best fit for some PC and especially Mac games from the mid-90s?
Adobe RGB: In case someone accidentally made (or makes) a relevant game using an Adobe RGB display? I don’t think I’ve ever heard of that happening, mind…
Adobe Wide Gamut RGB: I doubt anyone would ever accidentally make a game using this. I’m not sure there even are monitors that support this out of the box.
(I have phosphor primaries for an additional variant each on NTSC 1953 and Rec 601, but in practice they look so similar to the actual standards that it feels redundant to include them. I’m also still researching three other P22 phosphor set variants that may or may not be worth including. We’ll see if my local library can get me access to “Optical Characteristics of Cathode-Ray Tube Screens”.)
Are there any other color gamuts anyone would like to see included? I think this pretty well covers those that could be useful, but it’s possible I overlooked something.
What about this issue?
Are you going to revert to the v5.6 colour_grade.h?
You know what, forget about this. I have now confirmed and acknowledged that there actually was a problem, and the update to v5.7 really has solved the issue.
I’m just going to have to update my presets now to take the fixed gamma into consideration.
Besides the Donkey Kong Country example you previously cited, another perfect example of the issue is in SF2CEUC, a game where I previously used the grey box around the yellow text in the FBI warning as a gamma adjustment test.
Before v5.7 I could never get the grey box to completely blend into the background without darkening everything else along with it. With v5.7, I can now make these adjustments properly.
So thanks for the great work! @MajorPainTheCactus, @Azurfel.
240p is not surprising; it would be more interesting if you set the height to an even divisor of 2160p when upscaling, like 432p (2160 ÷ 432 = 5), and see if it makes notable differences. Results there may depend on the emulator/settings. Aspect ratio correction, for example, is handled differently in RA than in some other emulators like PCSX2. E.g. for 4:3, I’ve noticed the latter expands the height for e.g. 640x448, while RA changes pixels in the horizontal direction.
For the purposes of aligning masks, it is the content resolution that matters, accounting for any padding or cropping in the framebuffer. Proper integer scaling is a separate issue that needs to be handled elsewhere.
Semi-relatedly, you do not want to use the cropping overscan options built into RetroArch cores such as Mesen. Use shaders_slang\misc\image-adjustment.slangp or something similar that masks off the overscan instead.
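(The gist of that approach, in case it is unclear: black out the overscan border instead of cropping it, so the content resolution, and therefore mask alignment, stays untouched. Below is just a sketch of the idea, not image-adjustment.slang’s actual code, and the function and parameter names are made up.)

// Sketch: mask off "overscan_crop" pixels on each edge instead of cropping them.
vec4 mask_overscan(vec4 color, vec2 tex_coord, vec2 source_size, float overscan_crop)
{
    vec2 px = tex_coord * source_size;                  // position in source pixels
    vec2 lo = vec2(overscan_crop);
    vec2 hi = source_size - vec2(overscan_crop);
    bool inside = all(greaterThanEqual(px, lo)) && all(lessThan(px, hi));
    return inside ? color : vec4(0.0, 0.0, 0.0, 1.0);   // keep content, black out the border
}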
Ok, so you’re going to have to expand on this a bit, as I’m a bit lost as to where you think the clipping is occurring. HDR is Rec. 2020, and we’re converting from Rec 601/709 colour space to the intermediate XYZ space and then we transform out to Rec 2020 or expanded Rec 2020 space. This is all done with floats inside the shaders and stored out to and read in from intermediate float16 buffers. At no point should any clipping happen, as far as I can see. Where do you think the clipping is happening?
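To spell the chain out (just a sketch, not the shader’s actual code - the names here are placeholders, but the matrix values are the standard ITU-R BT.2087 709-to-2020 ones, which is what the 709 -> XYZ -> 2020 steps compose to):

// Sketch of the conversion chain described above.
const mat3 k709_to_2020 = mat3(
    0.6274, 0.0691, 0.0164,   // column 0 (GLSL matrices are column-major)
    0.3293, 0.9195, 0.0880,   // column 1
    0.0433, 0.0114, 0.8956);  // column 2

vec3 rec709_to_rec2020(vec3 linear_709)
{
    // Plain float maths; nothing is clipped here as long as the result is
    // written to a float16 (or wider) intermediate buffer rather than an
    // 8-bit UNORM one.
    return k709_to_2020 * linear_709;
}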
No, this isn’t what was fixed - we just changed the input gamma curve from a standards-based one to a straight power curve. No tonemapping occurs in this shader, only inverse tonemapping when HDR is used, much further down the shader pipeline than this point.
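To illustrate the difference I mean (a sketch only - these are the textbook forms, not lines lifted from the shader):

// Standards-based transfer function: piecewise, with a linear toe near black.
// (This is the sRGB form; the 601/709 curves use different constants but the same idea.)
float srgb_to_linear(float c)
{
    return (c <= 0.04045) ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
}

// Straight power-law curve: no linear segment, so the darks land differently
// than with the piecewise curve above.
float power_to_linear(float c, float gamma)   // e.g. gamma = 2.2 or 2.4
{
    return pow(c, gamma);
}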
Yes, although not ideal, I can’t guarantee backwards compatibility. If things need to be fixed then they need to be fixed; however, in this case it is quite simple to go back. We should try and get to the bottom of what is correct in terms of original hardware - this point has been a bit contentious in the shader for some time, as I’ve swapped back and forth between the two input gamma curves over time. I’ve gone with the current option as it keeps the shader’s darks closer to the colours used when not using shaders - I’m yet to be completely convinced whether that is correct to original hardware: console, cable and TV. I hear your complaints though.
Yes, I think this is just us running into discrete pixels and sharp fall-offs. If we don’t have integer scaling then every x number of lines you’re going to have an extra one, and with sharp fall-offs and the slot mask it really highlights this. I certainly started using integer scaling when using certain slot mask presets I’ve created. Whether I can do much about this I’m not sure, but I’ll certainly try to repro this and play around with the various angles of attack we have.
Great stuff, thanks for this! I’ll take a look and see what you’ve done! You’re going to have to bear with me though, as I’m currently away and have a ton of work to do when I get back - it’s not easy finding time atm.
When I have the time I hope to expand the AutoHDR addon for ReShade. However, is it needed on Windows 11? Can’t you turn Auto HDR on in Windows and just use Megatron ReShade with that? I don’t know, I don’t have Windows 11. Failing that, what are the problems with using RetroArch for those systems?
Just to go a little more in depth with all this until you have time to check things out for yourself: at least based on lilium’s HDR analysis tools, the simulated phosphor primaries are located at the Rec 709 primaries with the Colour System set to 709 and Original/Vivid set to Original:
or at the Expanded 709 primaries when set to Vivid:
Changing the Colour System or Phosphors makes the CIE graph for Original look like this instead, regardless of which system you pick:
Which sort of looks like the shader is attempting to use the 709 primaries to break out of 709?
Alternatively, if we add a new “kNTSCJP22_to_2020” in hdr10.h generated using Dogway’s NTSC-J primaries, we get this instead:
Or, for kNTSC1953_to_2020:
Based on what comparisons I’ve been able to do so far, I think this is giving much more accurate results. Specifically, I think that RetroArch in SDR with Dogway’s Grade and the display both set to 2020 colorimetry looks more similar to the solution I am testing than it does to 5.7.
Unfortunately, I haven’t been able to do the sorts of direct comparisons that would really satisfy me yet, as I only have one WCG/HDR display, and as it turns out this issue applies to the Expand Gamut option in RetroArch as well, so I can’t currently run a second HDR instance of RetroArch with Dogway’s Grade set to Rec 2020 for alt-tab comparisons, let alone do a proper side-by-side comparison.
Expand Gamut Off clamps colors to Rec 709:
and Expand Gamut On clamps to the Expanded709 space:
I have attempted to make my own testing fork of RetroArch with the primaries for the Expand Gamut setting adjusted, but the resulting build still locks to 709/e709 depending on the setting, so I am clearly missing something else in the code. I will keep at it for now and see if I can figure it out on my own, but if not, assistance would be appreciated once you are less busy.
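In case anyone wants to reproduce constants like the kNTSCJP22_to_2020 or kNTSC1953_to_2020 ones mentioned above, this is roughly the construction I used to generate them (a sketch rather than the exact code - the helper name is made up, but the NTSC 1953 and Rec 2020 chromaticities are the published ones, and the same recipe works for any other gamut):

// Build an RGB->XYZ matrix from xy chromaticities (columns are R, G, B).
mat3 rgb_to_xyz(vec2 r, vec2 g, vec2 b, vec2 w)
{
    vec3 R = vec3(r.x / r.y, 1.0, (1.0 - r.x - r.y) / r.y);
    vec3 G = vec3(g.x / g.y, 1.0, (1.0 - g.x - g.y) / g.y);
    vec3 B = vec3(b.x / b.y, 1.0, (1.0 - b.x - b.y) / b.y);
    vec3 W = vec3(w.x / w.y, 1.0, (1.0 - w.x - w.y) / w.y);
    mat3 M = mat3(R, G, B);          // unscaled primaries as columns
    vec3 S = inverse(M) * W;         // scale columns so the white point maps to W
    return mat3(R * S.x, G * S.y, B * S.z);
}

// NTSC 1953 primaries with Illuminant C white; Rec 2020 primaries with D65.
const vec2 n53_r = vec2(0.67, 0.33), n53_g = vec2(0.21, 0.71), n53_b = vec2(0.14, 0.08);
const vec2 wp_c = vec2(0.310, 0.316);
const vec2 r20_r = vec2(0.708, 0.292), r20_g = vec2(0.170, 0.797), r20_b = vec2(0.131, 0.046);
const vec2 wp_d65 = vec2(0.3127, 0.3290);

// Source RGB -> XYZ -> Rec 2020 RGB, composed into one matrix.
// (Note: this skips chromatic adaptation, so the Illuminant C vs D65 white
// point difference is carried through rather than adapted.)
mat3 ntsc1953_to_2020()
{
    return inverse(rgb_to_xyz(r20_r, r20_g, r20_b, wp_d65))
         * rgb_to_xyz(n53_r, n53_g, n53_b, wp_c);
}

From there it is just a matter of evaluating that once, pasting the resulting nine numbers into hdr10.h as the new constant, and wiring it up to the relevant setting.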
I’ve updated my preset pack taking the new Sony Megatron Colour Video Monitor v5.7 Gamma Curve into consideration. This update includes the first, simplified Decoupled CRT-Guest-Advanced-NTSC module provided by @GPDP1 plus integrated Super-XBR!
In some cases I can use Windows 11 AutoHDR, but unfortunately not always. The PS1 emulator Duckstation, the PS2 emu PCSX2 and the Dreamcast emu Redream won’t work; even with the Swapchain fix from Nvidia, which routes OpenGL and Vulkan over the DirectX API, I had no chance.
Teknoparrot and the Sega Supermodel emus (both OpenGL) work fine with Windows 11 AutoHDR when the Swapchain fix is used, and they look really good in conjunction with your Megatron shader.
The colors of Windows 11 AutoHDR look a bit better to my eyes when compared to your ReShade AutoHDR plugin or RetroArch. The red primary, for example, looks a tiny bit more pure and vibrant (not oversaturated), and the green just more pure green, while your AutoHDR plugin mixes in a tiny bit more yellow here. I am really talking about nuances, but it’s definitely there, and it doesn’t have anything to do with the HDR luminance values, which can mute and distort hues if set too low.
When I use the Windows 11 HDR Calibration app, I get exactly 800 nits peak brightness and 800 nits paperwhite as a result. Very close to the settings I used before (700/600) in the Megatron shader menu within RetroArch and in ReShade with your AutoHDR plugin, as I wrote here before I discovered the W11 AutoHDR feature.
By the way, during the W11 HDR calibration I set my LG GX OLED to HGIG in the TV’s HDR tonemapping menu. The TV’s dynamic tonemapping should not be used during HDR calibration, as the peak and paperwhite values will be false.
I also have an i1 Pro spectrophotometer, and as soon as I am home I will measure the R, G and B primaries and post the results here for W11 AutoHDR, your AutoHDR ReShade plugin, and the embedded HDR within RetroArch. I am excited to see if my eyes are lying to me regarding which colors are closer to the BT709 standard, or let’s say more accurate in hue, even if a bit oversaturated outside the BT709 coordinates.
If Windows 11 AutoHDR worked with all emulators, I would not mind using it and just sticking with your ReShade Megatron shader, as this combination is perfect to my eyes with the Aperture Grille mask. But well, it’s not that easy, and your AutoHDR plugin would really help here if it worked with OpenGL and Vulkan too. I hope it will not be too difficult for you to implement support, and yes, time is always an issue too.
Good job on analyzing the gamma and color space conversions.
With regards to making changes to hdr.frag on the Vulkan side: before compiling RA, the Vulkan shaders need to be compiled to a precompiled SPIR-V file.
So for your changes in hdr.frag to take effect you need to do something like
glslc.exe hdr.frag -mfmt=c -o hdr.frag.inc
and have the updated hdr.frag.inc in your source (same folder as where hdr.frag is) before compiling RA.
Maybe this helps.
That fixed it! Ty for the assistance!
I am now quite certain that the current Colour System/Phosphors setting colors are being clipped and/or clamped and/or compressed to Rec 709 with Original or Expanded709 with Vivid.
It is plain as day when alt tabbing between multiple instances that adding more gamut options to the “HDR: Original/Vivid” setting results in colors that much more faithfully match Dogway’s Grade in Rec 2020 mode.
(This also confirmed my expectation that, when testing the Content Color Gamut settings in my testing fork, Mask Accurate/Colour Accurate should be set to Colour Accurate, Colour System to r709, and Phosphors to NONE.)
(Yes, I’m still catching up, sorry everyone.) I presume this is an 8K TV? Looks great, I’d really like to see this on an 8K.
Ah yes, that’s the solution I proposed yesterday, which was after this post - sorry, catching up! This looks fantastic - I love Daytona - I need to try this out ASAP on my QD-OLED!
Ah, I didn’t know about OpenGL on D3D11 or Vulkan on D3D11 - fantastic tip! Thanks!