I played around with the updated shaders a lot this weekend and agree, something appears to be off with the Shadow Mask one on my LG OLED as well.
I wonder if the mask looking “off” is related to what Jamirus and I mentioned above. I described it as looking misaligned, but fixed it by enabling Integer Scale Overscale (not an ideal fix, I know). Here’s an example of the slot mask in a Sierra AGI game in ScummVM with aspect ratio correction enabled (4:3):
This does not look correct at all…and it shouldn’t because you’re not using the proper mask layout for your LG OLED TV.
I use Integer Scale and Integer Scale Overscale quite a bit, and I don’t think that has any bearing on the issue I experienced. I can recheck it, though.
I already showed that it’s something that’s happening when I employ Deconvergence.
So. I have good news, bad news, and more good news:
First bit of good news: I think I’ve sourced a more objective and satisfying solution to the gamma problem. Page 3 of Recommendation ITU-R BT.1886 (03/2011) includes what is referred to as a “CRT matching” EOTF. Unfortunately, I don’t have a solid grasp on how to implement this in slang, so I can’t really test it on my own before suggesting it. Thoughts and/or assistance, anyone?
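For reference, here is my reading of that formula as a rough slang-style sketch (the function name, parameter names, and constants below are my own transcription of the Appendix 1 curve and should be double-checked against the published text before anyone relies on them):

```glsl
// Rough sketch of the BT.1886 "CRT matching" EOTF (Appendix 1 to Annex 1),
// as I read it. Illustrative only; nothing here exists in the shader sources yet.
//   V  : normalised input signal, 0.0 - 1.0
//   lw : screen luminance for white in cd/m^2
//   b  : black level lift ("brightness" control), >= 0.0
float Bt1886CrtMatchEotf(const float V, const float lw, const float b)
{
   const float Vc = 0.35f;                   // breakpoint between the two segments
   const float a1 = 2.6f;                    // exponent above the breakpoint
   const float a2 = 3.0f;                    // exponent below the breakpoint
   const float k  = lw / pow(1.0f + b, a1);  // scale so V = 1.0 maps to lw

   return (V < Vc) ? k * pow(Vc + b, a1 - a2) * pow(V + b, a2)
                   : k * pow(V + b, a1);
}
```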
The bad news: I got lilium’s HDR analysis ReShade tools running with RetroArch, and I’m reasonably certain that the color system/phosphor conversions are currently happening entirely in Rec709/sRGB space, and thus are effectively being clipped to Rec709, at least if “HDR: Original/Vivid” is set to Original. (As you thought, Vivid instead locks/pushes everything to a non-standard “Expanded Rec709” gamut suggested by Microsoft.)
Second bit of (mostly) good news: I’ve already prototyped what seems to be a working solution, but it will require a fair bit of overhauling.
Basically, the function that “HDR: Original/Vivid” is tied to in hdr10.h is already set up to be turned into a Color System replacement. We just need to add more transforms to Rec2020 in addition to k709_to_2020 and kExpanded709_to_2020, which can be calculated/generated using https://www.colour-science.org:8010/apps/rgb_colourspace_transformation_matrix or http://color.support/colorspacecalculator.html
For example, kNTSCJP22_to_2020 would be:

```glsl
const mat3 kNTSCJP22_to_2020 = mat3(
   0.691014f, 0.261287f, 0.047700f,
   0.109646f, 0.876813f, 0.013541f,
   0.014382f, 0.096629f, 0.888989f);
```
Or, for an option that pushes well outside Rec709, kNTSC1953_to_2020 would be:

```glsl
const mat3 kNTSC1953_to_2020 = mat3(
    0.924130283352921f, 0.058040878058827f, 0.017828838588251f,
    0.081832452209371f, 0.839996416103710f, 0.078171131686919f,
   -0.003138728266940f, 0.033216715568834f, 0.969922012698106f);
```
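To be explicit about how I imagine these slotting in, here is a hypothetical sketch (the function name, parameter, and selection logic are purely illustrative, not the actual hdr10.h code; only the matrix names are real):

```glsl
// Hypothetical sketch: choose the source-gamut-to-Rec.2020 matrix based on an
// extended version of the existing Original/Vivid setting. k709_to_2020 and
// kExpanded709_to_2020 already exist in hdr10.h; the NTSC matrices are the
// ones proposed above. Further gamuts would be added the same way.
vec3 SourceGamutTo2020(const vec3 colour, const float content_gamut)
{
   if (content_gamut < 0.5f)
      return colour * k709_to_2020;          // Rec 709 (previously "Original")
   else if (content_gamut < 1.5f)
      return colour * kExpanded709_to_2020;  // Expanded Rec 709 (previously "Vivid")
   else if (content_gamut < 2.5f)
      return colour * kNTSC1953_to_2020;     // NTSC 1953
   else
      return colour * kNTSCJP22_to_2020;     // NTSC-J P22, and so on for more entries
}
```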
This method looks much better by eye and in the analysis tools, but I’m not 100% sure it is currently doing everything we might want in conjunction with the “Colour Accurate” option. (I think we also want to lock the current color system to 709, but I’m still testing.)
I’ll continue testing and converting the remaining gamuts for now.
I had also already been working on assembling some additional phosphor options for 60s and 70s TVs, so this will be a good time to add those assuming the UI has enough space.
Oh, and Special K can also take .jxr HDR screenshots of Retroarch with HDR shaders included. No idea where to host those off the top of my head tho.
So what was that fix in the last PR, the one that made all of my presets very dark, all about then?
How would you rate that fix?
IMHO if working on something so critical to a vast multitude of users, perhaps there should be some Beta/Alpha hosting somewhere and a little more thorough testing and consensus before a PR is made to the main branch of RetroArch.
To me these aren’t things that should be interfered with willy-nilly unless and until a better solution is confirmed to exist.
Also, it would be nice if there were some way or some formula to “get back” to where things were, so those who were satisfied with their settings don’t have to go about manually tweaking again.
That was a fix for a long-standing bug that was apparently causing tonemapping to be applied twice, resulting in severe artifacting/boosted brightness in darker/near-black colors. It just may not have been happening on your system for some reason we never nailed down.
The part of the shader code I am suggesting a potential alternative option for wasn’t changed in 5.7, and I agree that the current code should probably remain an option unless an alternative ends up being universally preferred, since it is clearly working well as is.
5.7 shouldn’t have made things darker in general tho… can you try reverting just colour_grade.h to the version from 5.6 to verify there isn’t anything else going on?
Also, were your settings used for those sgng pictures somehow darkened compared to the defaults in your preset pack? Because the overall brightness of those presets looks identical to 5.6 on my system, just without the weird artifacting/boosted brightness in dark colors.
If it was happening, I might have just increased my Gamma settings until it looked right to me - which is what I actually did.
So after this latest update, my presets, on which I had increased the Gamma and which I had tested so far, all seemed much darker.
I can check it, but if I’m testing a dark game it might appear darker in general; I wasn’t really differentiating between general darkness and darkness in the near blacks in my statement on the matter.
I can’t say for sure as I’ve already moved on from the settings in the first release of my preset pack.
As a matter of fact, I’ve already started adjusting my presets to take the new changes into consideration. I don’t really have a choice, because what was once satisfactory in the games I had tested so far is now unsatisfactory. If there was a bug that needed fixing, one I may have already tried to compensate for by turning some knobs, then I’ll just have to compensate for the new behavior instead.
It’s cool. I’ve been there before.
I still find near blacks pretty dark and now have the Gamma lowered to just under 2.2.
Previously I had raised the Gamma to 2.45+.
I’m AFK now so that’s the most accurate account I can give you currently.
My issue is the alignment of the scanlines. The mask makes no difference. From a distance it looks like there’s a squiggly line every 2 or 3 scanlines. Perhaps my photo doesn’t convey it very well, but thought it might be similar to what you described as “scanline gaps seem to be making a square/pulse wave-like up and down pattern instead of being in a straight line”.
Not sure if this is the same issue rancid is having when using slot mask, but I think it’s the same as Jamirus’ issue discussed further up.
Here’s a photo with and without integer scale overscale enabled (using the correct mask for OLED):
I’ve polished up the prototype gamut fix/improvement a bit and committed it to a fork for testing:
https://github.com/Azurfel/slang-shaders/tree/master/hdr/shaders/include
Only hdr10.h and parameters.h have been changed thus far.
The setting is currently called HDR: Content Color Gamut, replacing HDR: Original/Vivid right under peak luminance and paper white.
Current options are: 0:Rec 709 (previously Original)|1:Expanded Rec 709 (previously Vivid)|2:NTSC 1953|3:Rec 601|4:PAL|5:P22-J|6:P22-80|7:P22-90|8:RPTV|9:Display P3|10:Rec 2020
That ordering and the exact composition are very much preliminary, and the menu option in general is a rather messy WIP. I very much wanted to ensure that the following info was front and center during testing, even tho all that extra text makes a bit of a mess of things:
I would strongly recommend setting “Mask Accurate/Colour Accurate” to Colour Accurate, as the fully expanded gamuts further exacerbate Mask Accurate’s issues.
Based on comparisons using a dummy color gamut that fits entirely within rec 709, Colour System should be set to r709, and Phosphors to NONE with these settings.
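(As an aside on why the option label is so wordy: as far as I know, a slang shader parameter is just a numeric range declared in parameters.h, so the only place to spell out which number means which gamut is the description string itself. A hypothetical example of what such a line could look like, with a made-up identifier, might be:

```glsl
// Hypothetical parameters.h entry; the identifier and label are illustrative only.
// Default 0.0, range 0.0 - 10.0, step 1.0 to cycle through the gamut list above.
#pragma parameter hcrt_hdr_content_gamut "HDR: Content Color Gamut" 0.0 0.0 10.0 1.0
```

The mapping from each numeric value to its gamut name then has to live in the label text or in accompanying notes, which is exactly what makes the menu entry messy.)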
I just want to say that I am glad this shader is getting better and better thanks to you guys, especially with the correct color conversion and gamma mapping you are working on at the moment. Thank you so much.
In my opinion this is the most authentic shader available today, and it’s well worth every improvement. It’s just sad that the ReShade port doesn’t seem to work with DirectX 9, OpenGL and Vulkan, which many standalone emulators use.
For example, the excellent Redream Dreamcast emulator is based on OpenGL, and therefore I can’t run it with the shader in HDR. Also, the standalone PCSX2 emulator crashes with AutoHDR, despite being configured with DirectX 11/12.
I hope there is a possibility that the Megatron shader with AutoHDR will support all graphics APIs in the future. Otherwise my only choice might be to get the new RetroTINK 4K scaler with its built-in 4K CRT HDR masks, which is independent of software compatibility issues and can just be hooked up between my OLED and PC. The only problem is that this will be expensive…
Have you tried Flycast? I know that has a DX11 option (or at least it did the last time I used it).
Yes, I tried the standalone Flycast with ReShade, but it also crashes with the AutoHDR files and always resets to OpenGL in the menu. Only if I delete the AutoHDR files in the Flycast main folder does it work with the shader, in SDR and with low brightness.
On the other hand, the Flycast core in RetroArch has graphical glitches, and the 4K CRT mask is not accurate either. The internal resolution seems to be handled differently in RetroArch than in the standalone emulator; it appears to scale the output externally as well, which the Megatron shader doesn’t like, so the mask isn’t applied correctly.
The only standalone emulators I could get to run perfectly with ReShade and HDR are the PS1 emulator DuckStation and MAME. Standalone MAME works correctly in HDR if I choose the “bgfx” video option in the MAME settings, which should be based on the DX11 or DX12 API.
This is how the mask looks with Flycast within RetroArch (please view in fullscreen):
The internal resolution is native @ 640x480 and as you can see the CRT mask just looks wrong.
And this is how the mask looks correctly within DuckStation, even with a very high internal resolution above 4K:
Sorry to post these pictures of the mask in SDR; I have not yet found a way to save screenshots correctly in HDR and view them properly. If someone has a suggestion, that would be very nice, as it would make it a lot easier to post captured screenshots here that look the way the game does when running in HDR (higher brightness etc.), without using a smartphone.
This is how the game looks in the standalone Flycast emulator, with the correct mask compared to RetroArch, but unfortunately without HDR, as it crashes and resets to OpenGL.
And this is why the shader looks so good:
From a closeup, you can see that it faithfully resembles an aperture grille mask. Just a pure phosphor mask without any additional bulk processing, which for me personally is not needed, and it also keeps the shader lightweight and undemanding on hardware.
And HDR finally makes it possible to have sufficient brightness too.
It’s really just compatibility that is lacking at the moment, with the missing APIs and the crashes I experienced. Besides this, and maybe some fine-tuning of the color and gamma mapping mentioned above, the shader is probably as good as it will ever get in terms of mimicking a CRT.
I think it can even surpass CRTs in some aspects, as OLEDs have no convergence issues and better black levels overall, just to mention a few things.
I really like the way this mask looks. What TV/display do you use, if you don’t mind me asking, and what are your Peak Luminance and Paperwhite settings?
This is just a “Zoom-In” from the screenshot I posted above from the Flycast emulator running Soul Calibur. I use an LG GX OLED TV, and currently I am running Peak Luminance @ 1000 and Paperwhite @ 700.
Here are some close-up pictures of the same scene that I took off the TV with my smartphone. The first is with RGB Subpixel Layout and the second with RWBG Subpixel Layout:
It seems that both subpixel layouts work well with WRGB OLED TVs, so I will stick with RGB.
I also want to mention that the Red, Green and Blue Deconvergence settings in the shader allow for very precise fine-tuning of the phosphors, which is a very nice addition to the shader.
It can also help to eliminate “Rainbow” artifacts when games with slightly different resolutions are being played. I experienced this for example with Mortal Kombat, which has a slightly different resolution than 240p or 224p. Here the Deconvergence settings have to be used for a clean picture.
So the shader looks better in the standalone emulator with ReShade than in RetroArch? I had problems dialling in the correct resolution when using CRT shaders in ReShade, but haven’t tried in a while. Has it got easier? I’m surprised by the RGB mask on your OLED; it looks good. Does your TV actually reach 1000 nits, or is that just the setting you preferred the look of from the shader? As a side note, I’ve also been watching the RetroTINK 4K progress with interest. I’m tempted to use it for connecting an Amiga to my OLED, but as you say it’s quite an expense!
This has been shown generally not to be the case, at least with the LG OLED TV models and shader preset settings used prior to yours.
If you don’t mind, can you take some stabilized pics of something white or grey showing the RGB phosphor triads using RGB layout and OLED layout, both with and without Deconvergence using the Mask Accurate Mode (feel free to include Colour Accurate as well if you wish) please?
I just want to see if there has been a positive change in the layout or panel technology for our use case.
This is correct. I mostly use vertical Deconvergence to attempt to mimic the slight imperfections in convergence that a real CRT might have, but I haven’t been using it in a more intricate way recently.
Are you saying that the tweaked Deconvergence is what might also be contributing to that near perfect RGB layout in those pics you shared?
I really have to ask again…just for confirmation…not doubting you…
This is from your LG GX OLED TV using RGB Layout in Sony Megatron Color Video Monitor using tweaked Deconvergence settings?
Would you mind sharing your preset or at least your Deconvergence settings so that others can try them?
@hunterk, @MajorPainTheCactus, @guest.r, @Nesguy look at this thing of beauty! Can you believe this is WRGB/W-OLED?
Okay so I just did this and it looks much better. Just the way it did when I made my presets.
The only difference between colour_grade.h in 5.6 vs 5.7 is that this:

v5.6:

```glsl
const vec3 linear = r601r709ToLinear(white_point); //pow(white_point, vec3(HCRT_GAMMA_IN));
```

…was changed to this:

v5.7:

```glsl
const vec3 linear = pow(white_point, vec3(HCRT_GAMMA_IN)); // r601r709ToLinear(white_point);
```
I’m not a coder, so I don’t know what it does, but it looked and worked much better before. Unless this is really how things are supposed to be and need to be for whatever reason, can I kindly ask that this be reverted in the main RetroArch repo please?
That would be the bugfix. The version in 4.2-5.6 was causing the darkest colors to be brighter than they should have been relative to brighter colors, at least on my, @MajorPainTheCactus’, and @Wilch’s systems. MajorPainTheCactus thought it was double tonemapping.
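For anyone curious what the two lines actually do differently, here is a rough sketch of the standard curves involved; this is the textbook BT.601/709 inverse OETF versus a plain power law, not necessarily the exact code in colour_grade.h:

```glsl
// Illustrative only; not copied from colour_grade.h.
// Standard BT.601/709 inverse OETF: a linear segment near black, then a power curve.
float R601R709ToLinearRef(const float V)
{
   return (V < 0.081f) ? V / 4.5f
                       : pow((V + 0.099f) / 1.099f, 1.0f / 0.45f);
}

// Pure power-law decode, as used since 5.7 (gamma_in typically around 2.22-2.4).
float PowerToLinearRef(const float V, const float gamma_in)
{
   return pow(V, gamma_in);
}
```

Near black the two differ a lot: at V = 0.05, the piecewise curve gives roughly 0.011 linear, while a 2.4 power law gives roughly 0.0008, which is why the darkest colors come out brighter with the old decode and noticeably darker with the pure power decode.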
If you set Megatron’s brightness, saturation, and contrast to 0, gamma to 2.22, enable Colour Accurate mode, plug the same peak luminance and paper white settings you use for Megatron into Retroarch’s stock HDR menu, and compare Megatron to disabled shaders, does the output without shaders look more similar to 5.6 or 5.7?
Edit: It’s probably not this, but can you double check if you somehow ended up with “Output dynamic range” set to Full in NVCP, but your display’s Video Range/Black Level set to Limited/Low? Because as it turns out, that more or less completely covers up the bug in 4.2-5.6 and crushes blacks in 5.7.