When using this with RetroArch, do I need to enable HDR in the settings? I’ve tried to use it, but the screen looks like the gamma is too high… My TV is an LG C1, and I double-checked the configuration.
Definitely not. This method uses ReShade exclusively for the shader effects and the SDR-to-HDR tonemapping. You shouldn’t even have video shaders enabled in RetroArch. So you want RetroArch running in regular SDR mode and the Sony Megatron Color Video Monitor shader in ReShade.
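If it helps to verify, these are the relevant retroarch.cfg entries for this setup as far as I remember them (key names from memory, so double-check them against your own config):

```
video_shader_enable = "false"
video_hdr_enable = "false"
```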
Feel free to ask more questions if you’re stuck again.
Cyber is right: HDR should not be enabled in RetroArch. Lilium’s inverse tonemapping takes care of this. Everything in RetroArch and in the ReShade Megatron should be configured as SDR. Just make sure that HDR is enabled in Windows and that your display receives an HDR signal from the PC.
How are you getting this working in DuckStation? I was getting nowhere trying to get the HDR addon (either the original or Lilium’s fork) to kick in in ReShade on anything other than old PC games using dgVoodoo, and no standalone emulators. The one exception: in PCSX2, the Lilium inverse tonemapping shader was recognizing HDR in the picture, even though the HDR addon wasn’t working and gave me the same old “turn on hdr if your monitor supports it” message. I assume with DOSBox, xemu, and ScummVM my problem was OpenGL, but Dolphin, PCSX2, and DuckStation are set to Vulkan and it’s the same thing. Trying the Nvidia DXGI swapchain thing didn’t change anything, except it stopped making the screen dim after quitting until I turned HDR off and back on.
This is how I set everything up and it works for me:
I also get the error message in ReShade on the add-on tab that HDR support is not enabled, but it works nonetheless: HDR is activated on the Windows side and the TV is running in HDR.
And it works the same way in DuckStation and PCSX2 if I choose DirectX 11 or 12 instead of Vulkan and install ReShade for the correct API. The DXGI swapchain trick is also not necessary for me.
It is important to install “ReShade with full add-on support” (not the “normal” version) and then, during installation, select Lilium’s AutoHDR add-on and the inverse tonemapping shader, but I guess you did this correctly.
In some emulators, Flycast for example, I have to manually override the “CSP_Override” setting as in this picture to get it to work:
As you may notice, the screenshots are a bit blown out, but that is because Lilium’s inverse tonemapping is doing its thing, and I have not found a way to accurately capture HDR screenshots and present them correctly here.
Thank you, will try it out!
You’re welcome! I updated the post with a couple more tools.
The first issue I have is that I can’t save screenshots in the .JXR format. I tried it with the Windows 11 screenshot capture function and also the Nvidia GeForce Experience screenshot utility. They both just save pictures in .png format without any HDR information. RetroArch just crashes when I add it to Special K…
Also, Lilium’s tonemapping is based on scRGB instead of HDR10 PQ, which may also play a role here, and I am not sure if capturing inverse-tonemapped pictures works the same way as capturing normal HDR pictures.
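For anyone wondering what that difference actually means, here is a small sketch of my own (not Lilium’s code, and the helper functions are just made up for illustration) showing how the same luminance is stored as very different numbers in scRGB versus PQ:

```python
# Sketch only: scRGB stores linear values where 1.0 = 80 nits,
# while PQ (SMPTE ST 2084) stores a nonlinear 0..1 code for 0..10000 nits.

def nits_to_scrgb(nits: float) -> float:
    """Linear scRGB value for a given luminance (1.0 = 80 nits)."""
    return nits / 80.0

def nits_to_pq(nits: float) -> float:
    """Inverse PQ EOTF (ST 2084): absolute nits -> 0..1 code value."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for nits in (80, 400, 1000):
    print(f"{nits:5d} nits -> scRGB {nits_to_scrgb(nits):6.2f} | PQ {nits_to_pq(nits):.3f}")
```

So a capture tool that expects PQ code values will badly misread an scRGB buffer, which could explain washed-out or clipped screenshots.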
But I have a very simple solution for all of us: I just capture screenshots in SDR without HDR tonemapping, and then, if I want to view them at full brightness, I put my display in HDR mode and drag the Windows SDR brightness slider to maximum:
This way the pictures look pretty much identical and just as bright as when I play with Lilium’s inverse tonemapping.
Here is an example:
Put your display in HDR mode and the SDR slider to maximum and it will look correct, with only minor differences. It’s best to download the picture and view it in fullscreen, as it gets even brighter then, at least for me.
Here is a comparison of pictures I just captured with my phone off my OLED.
This screenshot shows how the game runs with Lilium’s HDR tonemapping:
And here is a snapshot saved in SDR, just viewed in HDR with the SDR slider at maximum:
The differences are small, and I think the easiest way is either to upload pictures in SDR and view them in HDR with the SDR slider at max, or just to take pictures of the screen with a camera while running the shader in HDR. For our purposes I think this is sufficient.
What GPU do you use? I use a GeForce GTX 1070.
To take HDR screenshots in .jxr format, you need to open GeForce Experience, click on Settings (the gear icon), and check “Enable experimental features…”
The In-Game Overlay also needs to be enabled.
Windows must be set to HDR for it to work. However, if you start Windows in SDR mode, then switch to HDR mode and try to take a screenshot, it won’t work properly unless you toggle the In-Game Overlay off and on again in GeForce Experience.
If Windows was started with HDR mode already enabled, HDR capture in .jxr format should work.
Some Nvidia graphics cards can also record video in HDR; mine cannot. When I attempt to record video, I get a message telling me it can’t be recorded in that mode.
That gives me an easy way to know whether I’m going to get proper HDR captures in .jxr format: once I get that message when trying to record, I know it’s going to work. If it starts recording, I know that GeForce Experience is still in SDR mode.
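If anyone wants to check whether a capture actually came out as HDR, here is a quick sketch of my own using the third-party imagecodecs Python package (the file name is a placeholder, and the float16 scRGB layout is what Nvidia’s HDR captures reportedly use):

```python
# Sanity-check sketch: decode the .jxr and look at the pixel format.
import imagecodecs

img = imagecodecs.jpegxr_decode(open('capture.jxr', 'rb').read())
print(img.dtype, img.shape, float(img.max()))
# float16 with values above 1.0 (1.0 = 80 nits in scRGB) -> genuine HDR capture.
# uint8 -> the overlay was still capturing SDR.
```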
I believe you, but my OLED TV, which is my only HDR display, is currently out of order, so I can’t try it. However, I’m able to watch HDR-mastered content using my own manual tonemapping on my non-HDR TV, and it looks really good.
I was just contemplating whether I could make a similar profile for game mode.
Here are my FakeHDR Settings if anyone else wants to try something like this:
TV: LG 55UB8300-UG
Input Label: PC
- Energy Saving: Off
- Picture Mode: isf Expert1
- Backlight: 100
- Contrast: 100
- Brightness: 50
- H Sharpness: 25
- V Sharpness: 25
- Color: 100
- Tint: R26
Expert Control
- Dynamic Contrast: Low
- Super Resolution: Off
- Color Gamut: Wide
- Edge Enhancer: On
- Color Filter: Off
- Gamma: 2.4
White Balance
- Color Temperature: Cool
- Method: 2 Points
- Pattern: Outer
- Points: High
- Red: 0
- Green: 0
- Blue: 0
Color Management System
- All Defaults <0>
Picture Option
- Black Level: Low
- LED Local Dimming: High
I have an RTX 4060 Ti.
That worked, thank you!
I just converted the JXR picture to SDR with this tool you posted:
Unfortunately, the result is not good, as you can see here:
Colors and brightness look very different from what Lilium’s tonemapping produces.
I can also send you the original JXR file, but I am not sure where to upload it, as none of the sites I tried support pictures in JXR format and at this size.
With that you should also be able to record HDR video.
You’re welcome. I’m glad it worked!
That’s not necessary if you capture a .jxr using GeForce Experience, as the .jxr file contains both the HDR image and a tonemapped SDR image in the same file. If you open the file with HDR disabled in Windows, it will correctly load the tonemapped SDR version. The tonemapping seems pretty basic to me, and of course it won’t look identical to the HDR version.
If you capture a .jxr using the Windows Game Bar, it’s supposed to save an HDR .jxr plus a separate tonemapped SDR .jpg (or .png, I can’t remember).
All of those tools and methods I posted produce differing results.
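To illustrate why: the SDR result depends entirely on which tonemapping curve a tool applies. Here is a rough sketch of my own (not one of the posted tools), assuming a float16 scRGB .jxr and the imagecodecs and Pillow packages, using a plain Reinhard curve:

```python
# Sketch only: tonemap an HDR scRGB .jxr down to an SDR .png.
# A different curve (Hable, ACES, BT.2390...) gives a visibly different
# result, which is why the posted tools all disagree with each other.
import imagecodecs
import numpy as np
from PIL import Image

scrgb = imagecodecs.jpegxr_decode(open('capture.jxr', 'rb').read())
scrgb = np.asarray(scrgb, dtype=np.float32).clip(min=0.0)

tonemapped = scrgb / (1.0 + scrgb)        # Reinhard: compress highlights toward 1.0
srgb = np.where(tonemapped <= 0.0031308,  # linear -> sRGB gamma encoding
                12.92 * tonemapped,
                1.055 * tonemapped ** (1 / 2.4) - 0.055)

rgb8 = (srgb[..., :3] * 255 + 0.5).astype(np.uint8)  # drop alpha if present
Image.fromarray(rgb8).save('capture_sdr.png')
```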
I use Mega, have you taken a look at my examples?
I just tried to record a video with RetroArch running the Lilium shader, and that results in very weird colors.
But what works is recording with RetroArch’s built-in HDR instead of ReShade and Lilium. When I view the video file, it looks exactly like it does in RA with proper HDR. For presenting the shader on YouTube, for example, this should be a good solution.
There is something about Lilium’s inverse tonemapping that results in different video output. Maybe it’s due to the scRGB output; I am not sure.
That works! It’s not 100% identical to the HDR version, as you correctly mentioned, but the colors look more correct now.
I have seen your examples, but I can’t upload anything there without registering. Did you have to register to upload the images?
Yes, it’s just a cloud storage provider like Dropbox or OneDrive, not a dedicated image hosting platform.
Those entities still have a lot of catching up to do.
You can try, but after YouTube re-encodes it and messes up the colour format, I don’t know if it will still look good.
I think YouTube only supports YCbCr 4:2:0 chroma subsampling for HDR videos, and these shaders require at least full-range RGB 4:4:4 for proper subpixel colour addressing.
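To illustrate the problem, here is a small sketch of my own (not anyone’s actual pipeline) showing how 4:2:0-style chroma subsampling smears a 1-pixel RGB stripe pattern like the shader’s aperture grille:

```python
# Sketch only: a repeating R, G, B single-pixel stripe loses its colour
# once the chroma planes are stored at half resolution.
import numpy as np

# One scanline of a 1-pixel RGB aperture-grille pattern, values in 0..1.
rgb = np.zeros((8, 3))
rgb[0::3, 0] = 1.0  # red columns
rgb[1::3, 1] = 1.0  # green columns
rgb[2::3, 2] = 1.0  # blue columns

# BT.709 RGB -> YCbCr (full range, good enough for illustration).
y  = 0.2126 * rgb[:, 0] + 0.7152 * rgb[:, 1] + 0.0722 * rgb[:, 2]
cb = (rgb[:, 2] - y) / 1.8556
cr = (rgb[:, 0] - y) / 1.5748

# 4:2:0-style horizontal subsampling: average chroma over pixel pairs,
# then stretch back to full resolution.
cb = np.repeat((cb[0::2] + cb[1::2]) / 2, 2)
cr = np.repeat((cr[0::2] + cr[1::2]) / 2, 2)

# Back to RGB: the pure stripes are gone, neighbouring colours bleed together.
r = y + 1.5748 * cr
b = y + 1.8556 * cb
g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
print(np.round(np.stack([r, g, b], axis=1).clip(0, 1), 2))
```

Luma survives, so the scanline structure stays, but the per-subpixel colour information the mask depends on gets averaged away.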
There are some mask, display resolution, and TVL combinations which might be compatible with 4:2:0, so you can experiment with those as well, but I look forward to seeing your results.
Here are some pictures that I just took of your screenshots while viewing them on my FakeHDR TV.
https://mega.nz/folder/BFZXiSQS#_WewqlIOaQk7LrrVBlEfCg
Hopefully they give you an idea of how good and “normal” they look to me in person.
I had to lower the refresh rate to 30Hz in order to get the full RGB 4:4:4 signal that is needed for them to look correct; presumably the HDMI link just doesn’t have the bandwidth for 4:4:4 at 60Hz.
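Here is the back-of-the-envelope math behind that, assuming an HDMI 2.0-class link with roughly 14.4 Gbit/s of effective video bandwidth (an assumption on my part about this TV’s ports):

```python
# Sketch only: pixel rate including blanking vs. available link bandwidth.
def gbit_per_s(h_total, v_total, hz, bits_per_pixel):
    return h_total * v_total * hz * bits_per_pixel / 1e9

# CTA-861 4K timing is 4400 x 2250 pixels including blanking.
for hz in (60, 30):
    rate = gbit_per_s(4400, 2250, hz, 30)  # 10-bit RGB 4:4:4 = 30 bits/pixel
    fits = "fits" if rate <= 14.4 else "does not fit"
    print(f"4K @ {hz}Hz, 10-bit RGB 4:4:4: {rate:.1f} Gbit/s ({fits})")
```

At 60Hz the 4:4:4 signal is over the limit, so the GPU falls back to chroma subsampling; at 30Hz it fits.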
OK, I think I will register at some point, if it’s free. I might also post some videos on YouTube in the future when I find the motivation. But then I’ll have to do it with RetroArch’s built-in Megatron HDR, which will not produce the same look as Lilium’s inverse tonemapping and ReShade.
For Sony Megatron Colour Video Monitor you can try:
Display's Resolution - 1 (4K)
Resolution - 2 (800TVL)
in order to get around the YCbCr 4:2:0 chroma subsampling limitations.
Alternatively, you can use the Shadow Mask pattern.
You seem to have the Lilium shader twice in your ReShade parameters list there. I only have one, and it’s always like your second one that shows the error “only HDR color spaces supported”.
Yes, I did notice that. I think it is because I reinstalled ReShade a few times. One of the two (the one with the error message) is not active, though, and does not cause double tonemapping.
As for why it is not working for you, I have no idea at the moment. Did you try to manually override the CSP setting? If you follow the instructions from my post #2475, it should normally work.
Is there a video tutorial or a more in-depth tutorial on how to set this up beyond what is in the OP? I have an LG C2 with HDR enabled. I updated the shader in RetroArch and enabled crt-sony-megatron-hdr-pass.slang in the quick menu. I have the brightness at full on the TV, the picture mode set to Standard, and Game Optimizer enabled. The TV’s brightness and contrast are set to 100, black level to 50, and dynamic tone mapping is on. The image just looks super dark when I’m playing games.
When I go to the shader parameters, I don’t know what to set the peak luminance to. I tried 770 and it still looks very dark. What am I doing wrong?