Shaders look different on second screen, despite the same resolution

So I have a 1080p monitor and a 4K TV, but I have configured the TV to run at 1080p, for parity.

I often switch from one to the other, but I noticed that the same shader I use for most systems looks bad on the TV, despite being displayed at the same resolution. The scanlines are uneven, and if I enable curvature, circular moiré appears and it looks awful.

On the monitor the shader looks fine.

How can i fix this so the shader looks good on the TV, without changing the resolution?

Your TV is upscaling your 1080p input to 4K anyway, and TV upscalers are usually bad.

Disable scaling on the second screen. Also, if the subpixel layout differs between the two screens, it's not going to look the same no matter what you do.

CRT shaders are not really one-size-fits-all.

Also, you have to make sure your TV is in "PC mode" and not applying any processing to the image.

Depending on the CRT shader preset, it might just look better if you leave the TV at 4K resolution.

When the shader relies on subpixels (for CRT mask emulation, for example), even the best upscaler money can buy will give wrong results. When using CRT shaders, always use the display's native resolution so that the shader does the upscaling, not the display.
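To illustrate why the shader should do the scaling: a minimal 1D sketch (hypothetical helper names, not anything from a real shader) comparing nearest-neighbour line replication against linear interpolation on an alternating scanline pattern. Nearest keeps the dark gaps fully black; interpolation blends bright lines into the gaps, which is roughly what a non-nearest TV upscaler does to an emulated scanline pattern.

```python
# Sketch: why non-nearest upscaling hurts emulated scanlines.
# A 1D "frame" of alternating bright lines and dark gaps, as a
# CRT shader would draw them at 1080p, upscaled 2x two ways.

def upscale_nearest(lines, factor):
    # Each source line repeated `factor` times: pattern preserved.
    return [v for v in lines for _ in range(factor)]

def upscale_linear(lines, factor):
    # Naive linear interpolation between neighbouring lines:
    # dark gaps get blended with bright lines and lose contrast.
    out = []
    n = len(lines)
    for i in range(n * factor):
        src = i / factor
        lo = int(src)
        hi = min(lo + 1, n - 1)
        t = src - lo
        out.append(lines[lo] * (1 - t) + lines[hi] * t)
    return out

frame = [1.0, 0.0] * 4            # bright line, dark gap, repeated
print(upscale_nearest(frame, 2))  # crisp: runs of 1.0 and 0.0 only
print(upscale_linear(frame, 2))   # blended 0.5 values appear between them
```

The same logic applies in 2D to mask patterns, where the blending also shifts colour because the mask is defined per subpixel.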

On Linux it's possible to downscale in integer steps so you can treat the 4K screen as 1080p. I don't know if this is possible on Windows; there are at least ways to upscale a 1080p borderless window with nearest neighbor. I don't know whether that benefits performance when applied to a RetroArch window, as opposed to applying shaders at native 4K. I would think it does, but I haven't tried it yet.
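On X11, one way to do this is with xrandr: render a 1080p framebuffer and let the GPU upscale it to the 4K panel with nearest-neighbour, so each 1080p pixel maps to a clean 2x2 block. A sketch, assuming the TV is on an output named `HDMI-1` (check yours with plain `xrandr`) and an xrandr new enough to support `--filter`:

```shell
# Sketch (X11, assumed output name HDMI-1): 1080p logical screen,
# GPU-upscaled 2x to the 4K panel with nearest-neighbour filtering.
xrandr --output HDMI-1 --mode 3840x2160 --scale-from 1920x1080 --filter nearest
```

Wayland compositors handle per-output scaling differently, so this applies to X11 sessions only.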

How can I do that? There isn't any option in the TV settings for that. Do you mean from the drivers or from RetroArch itself?

I meant from the drivers. You also have to enable GPU scaling for this to work. You'd end up with the image centered and surrounded by a large black border (on the 4K screen), but at least it would be consistent.

Another thing you could try is DSR or VSR (in your graphics driver control panel) and running both screens at 4K. This should give you an optimal image on the 4K screen and possibly the highest-quality downscaled image on the 1080p screen, and the two should look fairly consistent.
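The reason the downscaled side holds up: 4K to 1080p is an exact 2x reduction in each dimension, so every 1080p output pixel averages a clean 2x2 block of rendered pixels, which is effectively 4x supersampling. A small sketch of that box downscale (hypothetical helper, values as grayscale floats):

```python
# Sketch: 2x box downscale -- each output pixel is the mean of a
# 2x2 block of source pixels, i.e. 4x supersampling per pixel.

def box_downscale_2x(img):
    # img: 2D list of pixel values with even width and height;
    # returns a half-size image.
    h, w = len(img), len(img[0])
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

tile = [[1.0, 0.0],
        [0.0, 0.0]]
print(box_downscale_2x(tile))  # one pixel averaging the 2x2 block
```

A non-integer ratio would instead pull unequal contributions from neighbouring pixels, which is where unevenness and moiré come from.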

Of course you’d need enough GPU horsepower to push 4K.

There are other options as well, like running shader presets which don’t use any mask.

Yes, I changed the scaling from "Display" to "GPU" (Nvidia drivers), but the screen didn't get centered with borders; it was still full screen. There isn't an option to turn it off.

However, I did a quick test with a game that had the most noticeable issues with the shader, and it looks way better now. I think the issue might have been fixed this way for me. 🙂

Maybe the Nvidia drivers/cards are very good at scaling the image. Good to know from now on.

On both AMD and Nvidia there is an option to disable scaling. It might be called "Center".

It’s not “Keep Aspect”.

Send a pic of your scaling options.

This is how the settings screen should look on Nvidia. As you can see, there is a "No Scaling" option.

On AMD it’s not much different, you might have to choose “Center” on AMD but I can’t remember off hand.

https://www.amd.com/en/support/kb/faq/dh-019

https://www.amd.com/en/support/kb/faq/dh2-034#:~:text=Introduced%20in%20Radeon™%20Software,sacrificing%20visual%20quality%20or%20performance.

You can definitely try the DSR (Dynamic Super Resolution)/VSR (Virtual Super Resolution) method at 4K resolution as well once you have the GPU performance.

I think you’ll like that as well.

With the (Nvidia) integer scaling option enabled, the result should look good when 1920x1080 is scaled to the standard 4K resolution of 3840x2160, which is an exact 2x in each dimension.

The downside is that your 4k display is treated as an ordinary 1080p one.

Just wanted to add that this is for GeForce GTX 16-series or later GPUs.
