Is there any way to make this shader available?
I still can’t believe how similar the picture on the OLED (8K @ 800 TVL setting) looks to the CRT with content above 240p without scanlines (i.e. PS2, Dreamcast, GameCube, etc.)
The slot mask appears a bit coarser on a 55-inch TV compared to a 27-inch CRT, but it is not very noticeable yet. I wouldn’t go any lower with the TVL setting though, as this is the threshold in my opinion.
CRT:
OLED:
CRT close:
OLED close:
Makes sense if you’re using the 8K setting like that on 4K, because it results in a 6-pixel mask = 360 TVL.
An actual CRT is probably around ~450 TVL. Because there is no in-between, you end up with the mask being either a bit coarser (360 TVL) or finer (540 TVL).
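To make the arithmetic in the post above explicit, here is a small sketch (my own helper, not part of the Megatron shader): TVL counts mask triads over the picture height, so on a fixed-pixel display the mask repeat width in pixels determines the effective TVL you can reach.

```python
def effective_tvl(display_height_px: int, mask_width_px: int) -> int:
    """Effective TVL = vertical resolution in pixels divided by the
    mask repeat width in pixels (triads counted over picture height)."""
    return display_height_px // mask_width_px

# On a 4K panel (2160 px tall):
print(effective_tvl(2160, 6))  # 6-px mask -> 360 TVL (the "8K" setting)
print(effective_tvl(2160, 4))  # 4-px mask -> 540 TVL (the next step up)
# A 5-px repeat would land at 432 TVL, close to a real CRT's ~450,
# but 5 pixels don't divide cleanly into an RGB triad layout,
# which is presumably why there is no in-between setting.
```

This also shows why the jump from 360 to 540 TVL is so large: the mask width can only shrink in whole pixels.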
Your calculation seems right Jamirus.
I would also sometimes like to use the 1000 TVL (540 TVL) setting of Megatron, but the disadvantage is that it gets quite a bit darker, and somehow colors look different / not as correct as with 800 TVL (360 TVL).
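One plausible reason for the darkening at the finer setting (this is my guess, the actual Megatron mask patterns may differ): a 6-pixel slot mask has room to double every subpixel, while a 4-pixel mask may need a dark gap pixel, so a smaller fraction of the panel emits light.

```python
def mask_coverage(pattern: str) -> float:
    """Fraction of mask pixels that emit light ('X' marks a dark pixel).
    The patterns below are hypothetical illustrations, not the shader's
    actual layouts."""
    return sum(c != 'X' for c in pattern) / len(pattern)

print(mask_coverage("RRGGBB"))  # 6-px mask, every pixel lit -> 1.0
print(mask_coverage("RGBX"))    # 4-px mask with a dark gap -> 0.75
```

A 25% drop in lit area alone would make the finer mask visibly darker at the same peak brightness.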
Also, it depends on the content. Everything below native 480p looks better with coarser masks, in my opinion. Even if I owned a high-TVL Sony PVM or BVM, I don’t think I would play anything below 480p on it.
Yes, it really is impressive!
The most glaring differences I can see between the two are that there’s a difference in Gamma, with the CRT being darker. Not sure if there are other colour/tint/whitepoint/phosphor/colourspace differences as well which will be seen once Gamma is matched/corrected.
Also, the shader seems to be blurring some of the fine details in the textures. Perhaps a more conservative application of blur might yield even closer results!
CRT
OLED
CRT
CRT
OLED
CRT
OLED
The scanlines and sharpness can look very distracting; on my old PC monitor (probably around 1000 TVL), it was helpful to have an extra control to regulate sharpness.
Part of the impression is down to the mask type too: really high TVL slot masks (700+) are just very atypical, and there aren’t a lot of photos out there. The monitors aren’t that rare, though; if I had the room, I could probably get one within a fairly short time span.
This is because the CRT has slight “black crush”: I measured a gamma of around 2.6 at the low end of the greyscale, which flattens out to about 2.2 above 20 IRE. If I raise the brightness setting on the CRT, and therefore the gamma a bit, the black level gets destroyed and becomes a bit greyish. I am sure that if I play around with the flyback pots and other pots on the CRT chassis, I can dial that little “problem” in.
I also set the gamma in Lilium’s inverse tonemapping shader from 2.2 to sRGB, as with 2.2 the OLED is still slightly too dark (black crush) in the shadows.
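As a side note on why the 2.2 setting crushes shadows compared to sRGB: the sRGB curve has a linear segment near black, so it decodes dark signals to higher light output than a pure 2.2 power curve. A quick sketch (plain Python, not part of any shader):

```python
def srgb_eotf(v: float) -> float:
    """Piecewise sRGB decoding: linear segment near black,
    ~2.4-exponent power curve above the 0.04045 threshold."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma_22(v: float) -> float:
    """Pure power-law 2.2 decoding."""
    return v ** 2.2

# Compare the two transfer functions for dark input signals:
for v in (0.02, 0.05, 0.10):
    print(f"signal {v:.2f}: sRGB -> {srgb_eotf(v):.6f}, "
          f"gamma 2.2 -> {gamma_22(v):.6f}")
```

Near black, sRGB produces noticeably more light than gamma 2.2, which matches the observation that 2.2 leaves the OLED slightly too dark in the shadows.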
The CRT is otherwise calibrated as well as possible on my side to Rec. 709, and the delta-E errors of the greyscale and color-checker measurements etc. are very low. Apart from the difference in gamma, color-wise the OLED and CRT look very similar, if not identical.
I already dialed the blur shader in as low as possible, and if I go any lower (sharper), the coarseness of the slot mask and some of its color fringing become too apparent to my eyes. Also, these are pictures from my phone, and they can vary in sharpness from picture to picture; the focus is also never perfectly even across a single picture. The difference is really negligible to non-existent from a viewing distance.
Yep, high TVL monitors were not made for low-res content. But some people like 240p on BVMs, and that is okay, as everybody has different taste. I read somewhere that it looks like an LCD viewed through venetian blinds.
True, they were almost always dot mask or aperture grille as far as I know.
For low-res content, slot masks look best to me. With the Megatron shader it depends: the slot mask with scanlines looks different than without scanlines (i.e. with the 2160 CRT_Height setting). It could have a bit more detail with scanline content, but that is because my TV is only 4K and not 8K.
With an 8K display I think “scanlined slot masks” will look perfect with the shader. Currently I use the aperture grille for 2D stuff for this reason, as it does not need the same detail as a slot mask.
I think these were slot mask; not sure if the TVL was 700+.
https://www.reddit.com/r/Monitors/comments/141sa9f/guide_improving_hdr_fidelity_on_windows_11/
@Dennis1, take a look at this:
Feel free to give it a try if you’re interested.
When using this with RetroArch, do I need to enable HDR in the settings? I’ve tried to use it, but the screen looks like the gamma is too high… My TV is an LG C1. I double-checked the configuration.
Definitely not. This method uses Reshade exclusively for the shader effects and the SDR-to-HDR tonemapping. You shouldn’t even have video shaders enabled in RetroArch. So you want RetroArch to be in regular SDR mode, with the Sony Megatron Color Video Monitor shader running in Reshade.
Feel free to ask more questions if you’re stuck again.
Cyber is right: in RetroArch, HDR should not be enabled. Lilium’s inverse tonemapping takes care of this. In RetroArch and in the Reshade Megatron, everything should be configured as SDR. Just make sure that HDR is enabled in Windows and your display receives an HDR signal from the PC.
How are you getting this working in Duckstation? I was getting nowhere trying to get the HDR addon (either the original or Lilium’s fork) to kick in in Reshade on anything other than old PC games using dgVoodoo; no standalone emulators. The one exception: in PCSX2, the Lilium inverse tonemapping shader was recognizing HDR in the picture, even though the HDR addon wasn’t working and kept giving me the same old “turn on HDR if your monitor supports it” message. I assume with DOSBox, xemu, and ScummVM my problem was OpenGL, but Dolphin, PCSX2, and Duckstation are set to Vulkan and it’s the same thing. Trying the Nvidia DXGI swapchain thing didn’t change anything, except it stopped making the screen dim after quitting until I turned HDR off and back on.
This is how I set everything up and it works for me:
I also get the error message in the Reshade addon tab that HDR support is not enabled, but it works nonetheless: HDR is activated on the Windows side and the TV runs in HDR.
And it works the same way in Duckstation and PCSX2 if I choose DirectX 11 or 12 instead of Vulkan and install Reshade with the correct API. The DXGI swapchain is also not necessary for me.
It is important to install “Reshade with full add-on support” (not the “normal” version) and then, during installation, select Lilium’s AutoHDR add-on and the inverse tonemapping shader, but I guess you did this correctly.
In some emulators, like Flycast for example, I have to manually override the “CSP_Override” setting as in this picture to get it to work:
As you may notice, the screenshots are a bit blown out, but that is because Lilium’s inverse tonemapping is doing its thing, and I have not found a way to accurately capture HDR screenshots and present them correctly here.
Thank you, will try it out!
You’re welcome! I updated the post with a couple more tools.
The first issue I have is that I can’t save screenshots in the .JXR format. I tried with the Windows 11 screenshot capture function and also the Nvidia Experience screenshot utility. They both just save pictures in .png format without any HDR information. RetroArch just crashes when I add it to Special K…
Also, Lilium’s tonemapping is based on HDR scRGB instead of HDR PQ, which may also play a role here, and I am not sure if capturing inverse-tonemapped pictures works the same way as capturing normal HDR pictures.
But I have a very simple solution for all of us: I just capture screenshots in SDR without HDR tonemapping, and then, if I want to view them at full brightness, I put my display in HDR mode and drag the Windows SDR brightness slider to maximum:
This way the pictures look pretty much identical and bright, just as if I were playing with Lilium’s inverse tonemapping.
Here is an example:
Put your display in HDR mode and the SDR slider to maximum, and it will look correct with only minor differences. It’s best to download the picture and view it in fullscreen, as it gets even brighter then, at least for me.