Okay, I did this. At first I wondered how I was going to compare output with scanlines and mask enabled against output without, because obviously the shader on would be darker. But I did test both scenarios with the v5.6 and v5.7 colour_grade.h, and the v5.6 one is definitely closer to the shader off: it’s considerably brighter overall, and therefore much closer to the brightness with the shader off than v5.7 is.
It doesn’t only affect the near blacks; even the yellows, blues and reds seem darker in the game I used for my latest test, Lords Of Thunder (TurboGrafx-16 Super CD-ROM²).
It’s probably not, because I just checked this and my driver is set to RGB 4:4:4 Full and my TV’s Black Level is set to Auto.
The Megatron shader is also usable with the OpenGL and Vulkan APIs in HDR, if you own an Nvidia graphics card. I searched online for solutions and discovered this thread on the Reddit forums:
In ReShade, in the Megatron menu, you have to leave the SDR/HDR setting on SDR, but it works in HDR, because Windows now converts everything from SDR to HDR, regardless of whether ReShade is used. And all colours look like they should, with correct colour and gamma tonemapping, and HDR brightness.
I think you can now run every CRT shader you like, for example from Guest or CRT-Royale etc., with higher HDR brightness. You can also use the Megatron shader within RetroArch with the SDR/HDR setting set to SDR, and it will work in HDR too.
Only the Megatron ReShade shader from our man MajorPainTheCactus is needed; the AutoHDR ReShade plugin is no longer needed for HDR to work!
When you start the game, you even get an HDR message from Windows stating that it’s activated.
I hope you will test it out if you have an Nvidia card, and post some feedback.
Also a little hint I discovered: in Windows 11 you can press the following keys on your keyboard simultaneously to activate and deactivate HDR without going into the Settings menu:
This is the message I now get from Windows 11 when I start a game (sorry for the German; it roughly says that the game was improved with a brighter and more vivid image via Auto HDR):
This is how it looks in-game (as far as my camera can capture it) with the Megatron ReShade shader set to SDR, running the Sega Supermodel emulator and Daytona USA 2:
My camera makes the colours slightly duller than they really are; in reality it all looks punchy and vibrant like on a real CRT TV, without any of the exaggeration or oversaturation that the “HDR Vivid” setting in the Megatron shader produces ever so slightly.
Basically, the swapchain setting in the Nvidia Control Panel activated Auto HDR for games on my PC and, in the first place, allows OpenGL and Vulkan games to run through the DirectX API and therefore in AutoHDR. That was the missing piece with Megatron ReShade, as the AutoHDR ReShade plugin from MajorPainTheCactus does not work with OpenGL and Vulkan at the moment.
Here is a description from Microsoft regarding AutoHDR:
Originally this statement, along with your enthusiasm about your discovery, had me a little confused, because I wasn’t sure whether you were also referring to the Sony Megatron Color Video Monitor shader in RetroArch, which has supported HDR in Vulkan for a very long time now.
You seem to have a really good emulation setup there, especially with those magic RGB phosphors seemingly working properly on an LG WRGB/W-OLED TV, which is actually a first at least for me.
Would it be possible for you to create a couple of step-by-step guides, using text, images and video or a combination of the three, in a manner simple and clear enough that a baby could understand, showing users how to set things up the way you have, so that they can also share and experience what you might be experiencing?
I think this would be an immense contribution to the emulation community.
As a matter of fact, this entire area of HDR implementation is probably one of the most significant milestones in CRT emulation development and history in a very long time, yet who really knows about it apart from possibly a handful of us forum lurkers?
I would say that that is a tragedy.
I’m nervous. I hope I can get mine to look as good as yours. Lol
I believe you! I’m just imagining all the succulent CRTness you’re enjoying!
Thanks a lot for sharing this!
There was another user over on my preset pack thread a while ago who told me he used AutoHDR with other shaders. At the time I didn’t really think much of it; I just felt AutoHDR might alter the colours. But now you have me really interested in seeing it for myself.
After experiencing these HDR shaders it’s really hard to go back. It feels like a generational advance in CRT emulation.
I look forward to seeing universal support for HDR modes in all shaders in the future.
After I posted here this morning, I did some more tests with the PCSX2 and Redream emulators, and they don’t work with the Windows 11 AutoHDR feature. But I think there are workarounds for this too, if you dig deep enough on the internet for solutions.
But so far I got some of my favorite emulators working with AutoHDR, which includes Teknoparrot and Sega Supermodel.
Here is a quick step-by-step guide. At the moment I am not home, so I can’t post videos or pictures, but I’ll try to explain it as well as possible.
Windows 11 is mandatory and any other OS will not work as far as I know.
Make sure that your TV or monitor is working with 4:4:4 chroma, so the shader will not look distorted. Also check in the graphics card’s control panel that the signal is sent as RGB.
Check your video levels; the TV and PC should both work in full range (0-255).
Install your favourite emulator and set your video settings as desired in the emulator before installing ReShade. Set the resolution output to borderless fullscreen in the emulator.
Install the ReShade add-on version and select the emulator’s exe file during installation. Then choose the API the emulator will use, i.e. DirectX, Vulkan or OpenGL.
Note that Vulkan and OpenGL will only work if you have an Nvidia graphics card with a current driver and set the swapchain “hack” in the Nvidia Control Panel, as described here:
This allows the Vulkan and OpenGL APIs to “work over” the DirectX API within Windows, making it possible to use the Windows 11 AutoHDR feature, which normally only works with DirectX 11 and 12.
Run the emulator and check whether the “AutoHDR” message from Windows pops up, and also whether the ReShade overlay in the upper left corner is active.
If everything went fine, close the emulator and copy the ReShade files from MajorPainTheCactus into the reshade-shaders / Shaders folder:
Start the emulator again and the Megatron shader menu should now be available in ReShade, where you can tune it to your liking. With Windows’ AutoHDR you have to set the shader itself to SDR, as Windows 11 now does all the tonemapping.
I think these are the basics; it would be nice if you tried it for yourself and posted your opinion.
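As a rough aid for the video-levels check in the steps above, here is a minimal Python sketch (the filename and bar choices are mine, purely illustrative) that writes a grayscale PGM test pattern with bars at levels 0, 16, 235 and 255. On a correctly configured full-range (0-255) chain, both the 0/16 and 235/255 pairs should be visibly distinguishable; if either pair merges into one bar, something in the chain is clamping to limited range (16-235).

```python
# Write a binary PGM (P5) test image with four vertical bars at the
# critical near-black and near-white levels for a full-range check.
WIDTH, HEIGHT = 512, 128
LEVELS = [0, 16, 235, 255]          # near-black pair and near-white pair
bar_width = WIDTH // len(LEVELS)

rows = []
for _ in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Pick the bar this column falls into (clamp the last column).
        level = LEVELS[min(x // bar_width, len(LEVELS) - 1)]
        row.append(level)
    rows.append(row)

with open("levels_test.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))  # PGM header
    for row in rows:
        f.write(bytes(row))                          # raw 8-bit samples
```

Display the resulting file full screen with an image viewer that does no colour management of its own, and compare the bars.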
If you search on Youtube about the Windows 11 AutoHDR feature, you can see that it works remarkably well with many games:
Even if it’s a kind of “fake” HDR solution, because the games we use were not programmed with HDR in mind, I think it can look very convincing. With OLEDs I see no other solution, as they are just too dim in SDR for my taste with shaders that use 100% masks. Brighter LED TVs, especially newer Mini-LED TVs and monitors, should work in SDR too without all these workarounds. Maybe even the newer LG G3 OLED, which I want to buy next year when prices come down, may work in SDR, as it is much brighter than my current LG GX.
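To put rough numbers on why 100% masks are so dim in SDR, here is a back-of-the-envelope sketch. The 650-nit SDR peak is an assumed figure for illustration, not a measurement of any particular TV, and real panels complicate this (WOLED white subpixels, ABL, etc.); the point is just the order of magnitude.

```python
# A full-strength RGB aperture-grille mask lights roughly one subpixel
# per triad, so peak brightness drops to about a third.
sdr_peak_nits = 650        # assumed SDR peak of the panel (illustrative)
mask_coverage = 1 / 3      # one of three subpixels lit per triad

effective_nits = sdr_peak_nits * mask_coverage
print(f"Effective peak with 100% mask: {effective_nits:.0f} nits")
# With ~100 nits being typical SDR reference white, ~217 nits leaves
# very little headroom for the bright highlights of a real CRT.
```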
If there are any questions regarding the setup, I’ll try to help.
Thank you so much for this, you’ve given us plenty to work with here. I’ll definitely give it a try at some point time permitting.
For Dreamcast you can use the FlyCast Core in RetroArch alongside the Sony Megatron Color Video Monitor Shader.
On another note, this is what my LG 55OLEDE6P’s subpixel layout looks like when displaying full white using the Aperture Grille Mask at 300 TVL, Mask Accurate Mode, no Deconvergence.
I’ve tried adjusting the Horizontal Deconvergence while using the RGB layout to see if I can get my triads to look more like your RGB example, but it doesn’t seem able to move the subpixels any nearer to or further from one another (which is understandable, because you can’t light subpixels that aren’t there; so how would you move red, green or blue phosphors horizontally in relation to one another?).
Would it be possible for you to share a couple photos of your screen showing its subpixel layout under similar conditions?
I just want to see if there is a difference in the layout of the subpixels in your more modern panel.
I used Macro Mode with exposure set to the lowest with some basic mechanical stabilization for my pics.
I am also using the subpixel RGB layout and it looks good on my LG GX.
As soon as I am home and have some time I will post some screenshots.
Regarding Flycast, I already tested the RetroArch core as I posted earlier, but the core has more graphical glitches than the standalone Flycast emulator and also scales the games differently, resulting in weird-looking CRT masks.
I also have to try out Flycast with Windows AutoHDR again, maybe this will work.
If you use AutoHDR, make sure to use the calibration tool to get accurate nits and white level.
“Windows HDR Calibration”. This will create a special HDR color profile that Windows will use for AutoHDR.
If you need to automatically turn HDR on/off, you can try AutoActions
Apologies, I’ve been away. Yes, OK, so when you say this, what are you referring to? Are you saying some process in the electron beam exciting the phosphors automatically gamma-corrects? I was under the impression that power gamma is just a relatively simple way of biasing the luminance to match the human brain’s interpretation of light, so that a grey ramp appears evenly distributed to us, whereas a linear luminance doesn’t. Although I have no real idea, I’d have imagined the evolutionary/biological reason has to do with the iris being a circle, so that as it opens the light coming in follows an exponential curve. How does a CRT naturally adapt luminance in this way? I do realise different displays have different gamma, but I thought that was down to the internal circuitry being implemented differently.
As far as I’ve read (here, for example), it was the cameras that included that correction circuitry. Which makes sense when you think about it.
Like, if all TVs were going to behave the same way in this regard anyway, since they all used the same general tech base, why include that circuitry in the mass-produced equipment at the very end of the chain when it would be cheaper and less complicated to include it in the more limited-run, specialist professional equipment earlier in the chain?
Yes, gamma is the distribution of black vs white, and the curve represents the difference between mathematical consistency (linear gamma) vs perceptual consistency (roughly 2.2). Higher and lower gamma values have been subjectively favored at different time periods and in different standards based on preferences at the time. CRTs (at least color ones, I guess) were designed around a higher gamma (2.4-2.5) presumably to produce richer colors. This is a good document about it: https://www.benq.com/en-us/knowledge-center/knowledge/gamma-monitor.html
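A minimal sketch of the power-law gamma being discussed, assuming a simple 2.2 decode (real CRTs were closer to 2.4-2.5, as noted above, and sRGB actually uses a piecewise curve): the camera-side encode pushes code values upward so a grey ramp looks perceptually even, and the display’s power response undoes it.

```python
DECODE_GAMMA = 2.2  # assumed display response for this illustration

def encode(linear):
    """Camera-side gamma correction: linear light -> code value."""
    return linear ** (1 / DECODE_GAMMA)

def decode(code):
    """Display-side power response: code value -> linear light."""
    return code ** DECODE_GAMMA

linear = 0.5                 # 50% linear light
code = encode(linear)        # ~0.73: mid grey sits well above code 0.5
assert abs(decode(code) - linear) < 1e-9   # the round trip cancels out
print(f"50% linear light encodes to code value {code:.3f}")
```

This is why the correction could live in the camera: as long as every display decoded with roughly the same power curve, encoding once at the source was enough.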
Just a quick hint for anyone who uses the reshade port of Megatron.
The “CRT Height” setting is very important for correct scaling of the games.
I found that it should match the game’s native vertical resolution for proper scaling.
Internal resolution upscaling, from 384p to for example 2160p, doesn’t matter here; only the original resolution of the game is important.
For example, Sega Model 3 games have a native resolution of 496x384 pixels.
For correct scaling, the CRT Height setting should be 384 instead of the default value of 240.
In RetroArch the CRT Height setting is missing, so I don’t know whether there is a solution when you want to play games with a resolution other than 240p.
RetroArch’s shader system has access to the original frame size, so the shader can access that value automatically. Reshade doesn’t have that, so you have to set it manually.
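A hypothetical sketch of why the value matters (the function name is mine, not the shader’s actual parameter): a Megatron-style shader spaces its emulated scanlines at output_height / CRT Height output pixels, so a wrong CRT Height produces a scanline pitch that no longer lines up with the game’s actual source lines.

```python
def scanline_pitch(output_height, crt_height):
    """Output pixels per emulated scanline (illustrative model)."""
    return output_height / crt_height

# Sega Model 3 (496x384 native) on a 2160p display:
correct = scanline_pitch(2160, 384)  # matches the 384 source lines
wrong = scanline_pitch(2160, 240)    # default 240: scanlines too coarse
print(correct, wrong)                # 5.625 vs 9.0 pixels per scanline
```

With the default of 240, the shader would draw only 240 scanlines across 384 source lines, which is the mismatched, weird-looking result described above.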