Compensating for lost brightness from RGB mask

I’m trying to replicate what a CRT actually does at the macro level (i.e., extreme close-up). This means I need to add an RGB mask effect at 100% strength and “perfect scanlines” (a 30% reduction in brightness per line, given the non-linear relationship between voltage and brightness on a CRT).

I’m using a 1080p display without HDR, with a peak brightness of around 350 cd/m2.

I’m using the aperture grille effect from the “dotmask” shader and the scanlines from “zfast CRT”.

I’m pretty sure there is no way to increase the mask strength to 100% (i.e., decrease “mask dark” to 0.0) while still having enough spare brightness to compensate. The best I’ve been able to do is lower mask dark to 0.40 and raise “mask light” to 1.60. If I raise mask light beyond this, I get clipping; if I lower mask dark beyond this, I get an image that is unacceptably dim even with my LCD backlight at 100%. I may have simply reached the limit of what this monitor is capable of.
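To put rough numbers on the headroom problem, here’s a quick sketch. I’m assuming the mask simply multiplies one subpixel per triad by mask light and the other two by mask dark (which is roughly what dotmask-style masks do), and that “perfect scanlines” means every other line at 70%:

```python
# Back-of-the-envelope model of a dotmask-style mask (my assumption of how
# "mask dark"/"mask light" behave: each pixel passes one RGB channel at
# mask_light and the other two at mask_dark).

def avg_transmittance(mask_dark, mask_light):
    """Average brightness multiplier across one RGB triad."""
    return (mask_light + 2 * mask_dark) / 3

def clip_threshold(mask_light):
    """Highest input level that survives without clipping (output caps at 1.0)."""
    return min(1.0, 1.0 / mask_light)

# My current settings: ~80% of unmasked brightness; inputs above ~0.63 clip.
print(avg_transmittance(0.40, 1.60))  # ~0.8
print(clip_threshold(1.60))           # ~0.625

# Full-strength mask (mask_dark = 0): restoring average brightness needs
# mask_light = 3.0, which clips every input above ~1/3.
print(avg_transmittance(0.0, 3.0))    # 1.0
print(clip_threshold(3.0))            # ~0.333

# Stack the "perfect scanlines" on top (assuming alternate lines at 70%):
scanline_avg = (1.0 + 0.7) / 2
print(avg_transmittance(0.40, 1.60) * scanline_avg)  # ~0.68
```

So even my compromise settings only pass ~68% of the original light once scanlines are included, which is why the backlight has to be maxed out.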

I’ve been playing with the “image adjustment” shader, but I just get clipping or crushed blacks if I play with contrast or black level.

Am I right to conclude that I just need to get a better monitor to accomplish what I’m trying to do? I want the mask to be at 100% strength, the scanlines to be a perfect 30% reduction in brightness, and the screen to reach ~300 Lux on a white screen.

I’m assuming that I just need HDR to do this right, which is a bit of a bummer since HDR is pretty much exclusive to 4K and higher resolutions. At 4K, the RGB mask is simulating a 720 TVL CRT, while at 1080p it’s simulating a 360 TVL CRT, which is what I prefer. So many compromises… le sigh.


HDR isn’t going to help you insofar as RetroArch doesn’t have any way of activating HDR display modes. However, I would think an HDR-capable display should be bright enough to do what you’re wanting, just by virtue of needing that brightness to meet the HDR spec.

You can always run a 4K display at 1080p and it should look fine.

Yeah, that’s also what I was thinking.

As far as running 4K at 1080p, that should work! If we do that, the mask also gets scaled, so each RGB “phosphor” becomes a vertical line 2px wide (in real pixels), and you wind up with 360 TVL on a 4K display. Yay!
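As a quick sanity check on the TVL arithmetic (assuming “TVL” here just means phosphor triads per picture height, i.e., vertical resolution divided by triad width):

```python
# Rough TVL arithmetic: triads per picture height, where a triad is one
# R, G, and B stripe side by side.

def simulated_tvl(screen_height_px, phosphor_width_px):
    triad_width = 3 * phosphor_width_px
    return screen_height_px / triad_width

print(simulated_tvl(1080, 1))  # 360.0 -> 1080p, 1px phosphors
print(simulated_tvl(2160, 1))  # 720.0 -> 4K, 1px phosphors
print(simulated_tvl(2160, 2))  # 360.0 -> 4K, 2px phosphors
```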

However, having the display upscale the 1080p signal like this may introduce input lag vs. running it at its native 4K. It’s always best to run at native resolution for minimum input latency, right?

If we want to keep the 4K display at its native resolution AND make it so the RGB mask simulates a 360 TVL CRT, then it seems like the obvious solution is to alter the shader code for the aperture grille effect so that each RGB phosphor is 2px wide when running at 4K resolution (or have a user controllable option for adjusting the width of the phosphors from 1px to 2px). Essentially, you’d be giving the user the option of simulating 360 TVL or 720 TVL.
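To sketch the idea (in Python rather than actual shader code, and with mask_for_column and phosphor_width_px as made-up names for the hypothetical user option):

```python
# Sketch of a user-adjustable phosphor width for a dotmask-style aperture
# grille: which channel a given screen column lights up, and by how much.

def mask_for_column(x, phosphor_width_px, mask_dark=0.4, mask_light=1.6):
    """Return the (R, G, B) multipliers for screen column x."""
    channel = (x // phosphor_width_px) % 3  # 0=R, 1=G, 2=B stripe
    return tuple(mask_light if c == channel else mask_dark for c in range(3))

# 1px phosphors: columns cycle R, G, B, R, G, B, ...
print([mask_for_column(x, 1) for x in range(3)])
# 2px phosphors: each stripe spans two columns -> R, R, G, G, B, B, ...
print([mask_for_column(x, 2) for x in range(6)])
```

In the actual shader this would just mean dividing the x coordinate by the chosen phosphor width before taking it mod 3 when picking the mask color.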

You should be able to adjust the mask code, yeah, but scaling a 1080p signal to 4K usually doesn’t add much if any latency. For example, here are the latency figures from a TCL TV I’ve been looking at that’s natively 4K:

[image: input latency figures for the TV at 1080p and 4K]


What’s interesting is that it has lower input latency in 1080p than it does at 4K; wouldn’t have expected that.