Should I turn hardware bilinear filtering on or off?

Do RetroArch’s shaders expect hardware bilinear filtering to be turned on or off?

Yes, I did see this comment:

# Smoothens picture with bilinear filtering. Should be disabled if using pixel shaders.
# video_smooth = true

But I was experimenting with the Pixellate shader last night, and it was clearly designed to be used with bilinear filtering turned on.

It depends on the shader. Some expect bilinear, some expect nearest. xBR et al. require nearest, while most of the CRT shaders don’t care either way. Many of the ‘retro’/pixellate shaders require bilinear.


I thought loading a shader always disables RA’s bilinear filtering, though? So it should look the same whether the option is enabled or not. I think only settings like filter_linear0 in the shader’s preset file can enable that kind of filtering to go along with the shader.
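For example, a minimal single-pass preset might look something like this (just a sketch; the shader path is a placeholder):

# hypothetical preset, e.g. my-shader.glslp
shaders = 1
shader0 = "shaders/my-shader.glsl"
# This per-pass setting, not video_smooth, controls how the pass samples its input:
filter_linear0 = true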

Yep, that is correct. The video_smooth setting basically only applies in the “don’t care” case, I think.

Personally, I never understood why this was enabled by default; it simply looks awful. It’s the first thing I disable, and I use a CRT shader instead.

With bilinear filtering and integer scaling both off, you get nasty uneven pixels. I guess most people would prefer slightly blurry scaling that uses the full vertical area of their display over sharp integer scaling with black bars, so that’s why bilinear is enabled by default. Shaders like pixellate or sharp-bilinear are better non-integer scaling options overall, but not all platforms support them or can run them at full speed.
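If you prefer the sharp look, the relevant retroarch.cfg settings are these two (a sketch that flips both defaults):

# Sharp pixels with black bars instead of blurry full-height scaling:
video_smooth = false
video_scale_integer = true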

I don’t use bilinear filtering or integer scaling, and at 1080p and TV viewing distance I really struggle to notice any scaling artefacts. It’s easy to notice that horrible bilinear blur, though :stuck_out_tongue: