Whoops, I thought I mentioned that one when you added scRGB. Guess not.
The Vulkan HDR options seem to (permanently) come back if you enable HDR10/scRGB in D3D11/12 and then switch back to Vulkan.
Pictures to come when I have the time to get them properly sorted for posting.
Edit: https://imgur.com/a/Kj9nR11
Taken using 2.8.0 as requested. The “8K” set got messed up somehow and wasn’t usable. I will have to retake those at a later date.
Thank you very much. I’ll have to do something similar using my miniLED or IPS TV to use as a point of reference for which Mask Types and TVLs work well on WOLEDs vs IPS/VA LCDs. Feel free to make your own comments on how the various Masks you captured and shared hold up to what the shader is really trying to render.
Thanks - so this was a bug that I thought I fixed a while ago. It now appears to only happen in certain situations - maybe it's fine in debug builds? It seems to work fine when I build locally on the same laptop. In this instance I can go into D3D12 and turn it on, but it will not turn the Vulkan HDR on too.
Another bug you might be interested in: in both scRGB and HDR10 we appear to have colour-warping bugs at high peak luminance values (this is with scanlines and masks off, so just vanilla HDR). In Vulkan, I'm seeing both green and blue saturating to white in scRGB, and blue saturating to white in HDR10.
This is interesting, as my other devices don't show this warping: both my Android devices (phone, 2800 nits; tablet, 1000 nits) are fine, and so is my Windows PC. In D3D12 it's even odder: at precisely 300 nits peak luminance the whole colour balance flips. I've yet to confirm this D3D12 bug in my local build - it may only affect the nightly build, as the Vulkan menu bug does - but something is certainly up.
Hmm, you know what - I'm wondering whether this laptop screen is actually displaying 4:4:4. That may well be it.
OK, so this is a bit wild: I'm getting all sorts of weird and wonderful behaviour between drivers as well - the AMD drivers behave totally differently to the Nvidia driver, even after I try to align them on HDR, bit depth and full chroma. This may take a bit of time to unravel…
Hi, What is the ideal TVL for 1080p when using Sony Megatron? And if I want to simulate a high-end consumer CRT (like a Trinitron), what TVL range should I use? Thanks!
This is all subjective.
I’d guess probably around 600TVL.
I would recommend 300TVL with the shader parameter resolution set to 1080p for a 1080p TV.
Speaking ideally, 600TVL at 4K seems to generate phosphor patterns that are a much closer (but not exact) match for the late consumer Trinitron models from the early to mid 00s. I also personally prefer and use 600TVL the grand majority of the time on my 4K display. Unfortunately, 1080p just doesn't have enough resolution to properly resolve the 600TVL setting's simulated phosphor mask, and I think the final image is quite lacking in clarity and sharpness as a result.
300TVL/1080p is coarser, and the simulated phosphors may be too large for your eyes to properly blend depending on the size of your screen, but the clarity and sharpness at which Megatron excels is maintained.
Last night I committed numerous fixes and changes. The headline change is the removal of Peak Luminance and Paper White Luminance from the HDR menu, replaced with a single Brightness option. This effectively means your physical display's brightness setting is now the peak luminance control, and paper white luminance is controlled by the HDR menu's Brightness setting.
Why did I do this?
Numerous reasons:
a) Simplification - users were constantly asking what to set Peak Nits to and which value to use: sustained, peak, etc.
b) It brings HDR10 in line with scRGB in how it works, which means more consistent behaviour (although the two will still fundamentally differ because of how they work).
c) RTINGS is now behind a paywall, so it's no longer simple to look up your display's peak luminance.
d) It's better to leave inverse tonemapping to the custom shaders, which are more complex and provide finer-grained control.
So if you are using the HDR shader parameters, you need to use the new names. MaxNits has been removed - if you need peak luminance, expose it as a custom shader parameter. PaperWhiteNits has been renamed to the more general BrightnessNits and maps to the single Brightness control in the menu - a value in nits.
I also fixed metadata setting in DirectX - Microsoft has said for a while not to set it, and I was seeing it cause issues on my laptop by skewing colours - weirdly, around a peak of 300 nits.
Also fixed is support for multiple GPUs across all drivers, i.e. laptops with both an integrated GPU and a discrete GPU.
Plus a few other minor bits and bobs.
We now have two strategies for setting up HDR brightness for your display:
1. Turn your physical display's brightness up to its peak, then set RetroArch's Brightness to 80 nits or above, to taste.
OR
2. Set RetroArch's Brightness high, then use your physical display's brightness setting to bring the overall level down to a comfortable paper white.
Why 80 nits in the first strategy? It's the reference luminance for standard white, i.e. SDR white, so going above 80 moves you into HDR territory. How far above that value you go is up to you and your display.
Effectively, in strategy 1 we're treating your physical display's brightness level as 'Peak Luminance' and RetroArch's Brightness setting as 'Paper White Luminance'; in strategy 2 we're treating your physical display's brightness level as 'Paper White Luminance' and RetroArch's Brightness setting as 'Peak Luminance'.
For me this is a noticeable downgrade as the pure white specular highlights hurt my eyes. I liked lowering the Peak to 800 from 1000 just to cool off the intense whites of lights in RE2, which is otherwise pretty dark… (2300 nit peak brightness panel here)
Eyestrain is real, but it's overall minor - I can adjust… but why can't we have a Menu Brightness setting in there? The menu causes eyestrain in HDR mode, and I'd really like to dial it down to at least the paper-white nits.
And the Save Slot hotkeys still won't loop anymore: the slot goes down to -1 but doesn't wrap from 0 to 999.
I’m using 4K in the settings and 800 TVL with an aperture grille mask, and it seems to look good at 1080p.
If you’re using the 4K 800TVL setting at 1080p resolution, the resultant phosphor/Mask effect won’t be 800TVL.
It’s good to post and share screenshots and photos when discussing these things so others can see what you see instead of having to visualize.
It looks like the EnableHDR uniform does not work anymore. What is its replacement?
There are a number of HDR monitors that don’t allow brightness to be adjusted in HDR mode, FYI.
That should work - maybe I broke it - I'll take a look tonight.
As I say, your peak luminance is now your display's brightness setting, but I hear what you're saying, and a menu-only brightness is now relatively trivial because I've separated out the menu rendering to a different render target. The issue is the menu plumbing: it's quite a bit of work, touching 30-odd files. So it's definitely a feature everyone wants, but where do we want it? In Settings->Video->HDR, or in Settings->User Interface->Appearance where all the other menu visual settings are? At this point I'll defer to @hunterk - which place do you think would be best?
As for the display's HDR brightness - I own a display that doesn't support adjusting brightness in HDR mode, but on that display (at least) it's not a problem, as RetroArch's Brightness setting is all you need.
Ah, just looked - yes, we renamed it to HDRMode when we introduced scRGB. So the names to update in your shader are:
EnableHDR → HDRMode
PaperWhiteNits → BrightnessNits
HDRMode is now a uint (rather than a float, as EnableHDR was), where 0 = off, 1 = HDR10, 2 = scRGB.
Also, you'll need to add MaxNits yourself as a custom shader parameter, like you normally would.
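For anyone updating a custom slang shader, the rename might look something like this sketch - the binding layout, default value and parameter range here are illustrative assumptions, not copied from the Megatron source:

```glsl
// Frontend-supplied HDR uniforms under their new names.
layout(std140, set = 0, binding = 0) uniform UBO
{
   uint  HDRMode;        // was: float EnableHDR. 0 = off, 1 = HDR10, 2 = scRGB
   float BrightnessNits; // was: PaperWhiteNits
} global;

// MaxNits is no longer supplied by the frontend, so expose it yourself
// as a regular shader parameter if your shader needs peak luminance:
#pragma parameter MaxNits "Peak Luminance (nits)" 700.0 0.0 10000.0 10.0
layout(push_constant) uniform Push
{
   float MaxNits;
} params;
```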
FYI: the current nightly still respects the video_hdr_max_nits setting in retroarch.cfg, despite that setting no longer being exposed in the menus. It can still be used to control UI brightness, as I described previously.
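In other words, a line like this in retroarch.cfg (the value is just an example) still takes effect even though the menu entry is gone:

```
# No longer exposed in the menus, but still honoured by the current nightly;
# can be used to rein in UI brightness.
video_hdr_max_nits = "400.0"
```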
I recommend against this implementation in the absolute strongest terms possible. It runs fundamentally counter to how HDR is meant to work on the most basic philosophical level, and it will result in innumerable unforeseeable issues that vary by display.
For just one working example of the sorts of issues this will cause: in HDR, LG’s WOLED TVs always need to have their OLED Pixel Brightness at 100, Adjust Contrast at 100, and Screen Brightness at 50, or tone mapping will be broken to one degree or another.
If you want to simplify RetroArch’s HDR menu, i would instead recommend tying both “Peak Luminance”/video_hdr_max_nits and “Paper White Luminance”/video_hdr_paper_white_nits to a single shared “Equivalent SDR Luminance” menu setting (or whatever you want to call it). They should be set to the same value anyway for SDR matching purposes, particularly with scanlines off.
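Under that suggestion, the two existing cfg settings would simply be driven by one shared value - for example (value illustrative):

```
# Both tied to a single "Equivalent SDR Luminance" value:
video_hdr_max_nits = "400.0"
video_hdr_paper_white_nits = "400.0"
```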
How do I post screenshots here without losing quality?