Which framebuffer do shaders operate on?

Hello, I’m new to RetroArch and it seems pretty great for NES games (using the Nestopia core), but there’s something technical I would like to know about how the shader pipeline works.

Basically, I want to know whether shaders get applied to the game’s framebuffer BEFORE or AFTER it has been resized/stretched to suit the chosen aspect ratio.

My current thinking is that shaders wouldn’t work properly if they operated on a framebuffer that had already been stretched/resized to fill the screen.

For instance, a CRT shader’s scanlines need to align with each row of the game’s pixels, and on my system they do in fact align. So I’m thinking the shader pipeline must take place BEFORE all the aspect ratio stuff, and once all shader passes are complete, that final framebuffer gets handed downstream for aspect ratio scaling by the emulator/RetroArch? Is my understanding correct here?

Another example is the “interpolation/sharp-bilinear-2x-prescale” shader. On my system it achieves non-integer scaling while keeping the image sharp like nearest-neighbour scaling, but without the aliasing or shimmering you would normally get when nearest-neighbour scaling to a non-integer resolution. It seems this could only work if the shader has access to the game’s native framebuffer prior to any scaling: the shader does nearest-neighbour 2x integer scaling, which fills most of my screen, then does a small amount of bilinear (soft) upscaling to fill the screen and correct the aspect ratio, roughly as in the sketch below. I can only conceive of this working if the shader pipeline runs BEFORE the aspect ratio scaling/resizing, but I’m not 100% sure that’s the case.
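To make my mental model concrete, here’s a rough sketch in plain C of what I imagine the shader is doing. The function names are mine and this is only the concept, not the actual shader code:

```c
#include <math.h>

/* Pass 1 (my assumption): nearest-neighbour prescale of the 256x240 source by
 * an integer factor (2x here), so each source pixel becomes a crisp 2x2 block.
 * Grayscale floats are used for brevity. */
static void prescale_nearest_2x(const float *src, int sw, int sh, float *dst)
{
    for (int y = 0; y < sh * 2; y++)
        for (int x = 0; x < sw * 2; x++)
            dst[y * (sw * 2) + x] = src[(y / 2) * sw + (x / 2)];
}

/* Pass 2 (my assumption): plain bilinear sampling of the prescaled buffer at
 * the final viewport size. The remaining scale factor is close to 1, so the
 * blur is subtle, but it smooths out the uneven pixel widths that cause
 * shimmering when nearest-neighbour scaling to a non-integer size. */
static float sample_bilinear(const float *img, int w, int h, float u, float v)
{
    float fx = u * w - 0.5f, fy = v * h - 0.5f;
    int x0 = (int)floorf(fx), y0 = (int)floorf(fy);
    float tx = fx - (float)x0, ty = fy - (float)y0;
    int x1 = x0 + 1, y1 = y0 + 1;
    if (x0 < 0) x0 = 0;
    if (y0 < 0) y0 = 0;
    if (x1 > w - 1) x1 = w - 1;
    if (y1 > h - 1) y1 = h - 1;
    float a = img[y0 * w + x0], b = img[y0 * w + x1];
    float c = img[y1 * w + x0], d = img[y1 * w + x1];
    return (a * (1 - tx) + b * tx) * (1 - ty) + (c * (1 - tx) + d * tx) * ty;
}
```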

I don’t like it when things “just work” and I have no idea why. I must know!

Thanks

The answer is basically “at the same time.” You’re correct that you don’t want any scaling/stretching after the CRT effect, and you don’t want to scale/stretch before because you’ll end up breaking shaders that do pixel/neighbor comparison (like xBR et al.).

So, the shader is the part that does the scaling/stretching. If you don’t use any shaders, RetroArch does a basic bilinear or nearest-neighbor scale/stretch using the GPU’s built-in transformations.


Very good to know that info, thanks.

So just to make sure I’m 100% clear: when I load a CRT shader preset, the CRT shader first upscales the NES’s native (unscaled) internal framebuffer to my display resolution for aspect ratio and screen-filling reasons, and then applies the CRT effects like scanlines etc.?

And if so, does this also work the same way when I enable integer scaling in RetroArch? i.e. does the CRT shader preset have access to RetroArch’s user settings, see that I have integer scaling enabled, and do integer scaling followed by the CRT effects?

And finally, an observation on integer scaling: with no shaders enabled, integer scaling enabled, and the aspect ratio set to something wider than the NES’s native internal framebuffer (such as 4:3 or 8:7), I observe that nearest-neighbour scaling is used in the horizontal direction to widen the image for aspect ratio purposes. So integer scaling only seems to make the vertical direction pixel perfect; the horizontal direction is not, because of the aspect stretching, and still shows nearest-neighbour artefacts (shimmering and aliasing on pans). Is this how it’s supposed to be?

I was a bit confused by it at first, as I thought RetroArch would use bilinear for the horizontal scaling to avoid the nearest-neighbour artefacts on pans, which are quite ugly. As I mentioned in my previous post, I can use interpolation/sharp-bilinear-2x-prescale to avoid the issue altogether. I was just surprised it required this much user intervention to get a clean image; at one point I thought it must be a fault of my system, but now I’m thinking no, RetroArch really does use nearest neighbour for all scaling until the user intervenes with shaders or sets the global scaling option to bilinear.

It happens at the same time. If you enable integer scaling, the CRT shader doesn’t know anything about the integer scale option, but the CRT effect is calculated for the integer-scaled output because that’s the size it’s rendering to.

RetroArch first handles the vertical axis and either stretches to fit or snaps to an integer multiple, depending on your settings. Then, it stretches to the appropriate width based on your aspect ratio setting, using either bilinear or nearest. If you want to do anything fancier, you have to bring a shader into the mix.
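In very rough terms (a simplified C sketch, not RetroArch’s actual code; variable names are mine, and overscan cropping and custom viewports are ignored), the viewport calculation works out to something like:

```c
typedef struct { int width, height; } viewport;

static viewport compute_viewport(int core_w, int core_h,      /* e.g. 256x240   */
                                 int screen_w, int screen_h,  /* e.g. 1920x1080 */
                                 double aspect,               /* e.g. 4.0 / 3.0 */
                                 int integer_scale)           /* 0 or 1         */
{
    viewport vp;

    /* Vertical axis: either stretch to fill the screen, or snap to the largest
     * integer multiple of the core's height that still fits. */
    if (integer_scale) {
        int factor = screen_h / core_h;          /* e.g. 1080 / 240 = 4 */
        if (factor < 1) factor = 1;
        vp.height = core_h * factor;             /* e.g. 960 */
    } else {
        vp.height = screen_h;
    }

    /* Horizontal axis: width follows from the chosen aspect ratio, clamped to
     * the screen. It is generally NOT an integer multiple of the core's width,
     * which is why horizontal nearest-neighbour artefacts can remain even with
     * integer scaling enabled. */
    vp.width = (int)(vp.height * aspect + 0.5);
    if (vp.width > screen_w)
        vp.width = screen_w;

    return vp;
}
```

For example, with a 1080p screen, a 256x240 core, an 8:7 aspect ratio, and integer scaling on, this gives roughly a 1097x960 viewport: 960 is exactly 4x240, but 1097 is about 4.29x256, which is where the horizontal shimmering you noticed comes from.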

RetroArch doesn’t generally make a lot of decisions for users. Defaults are chosen to be safe and compatible rather than for top performance, and almost anything related to video is handled through the programmable shader pipeline rather than by hardcoding video options into the menu.


Thanks, I think I understand now: the shader takes the game’s unscaled framebuffer (or should I say texture) as the input and produces an output texture whose size is a function of the user’s display resolution, aspect ratio, and integer scaling settings.


So if the input texture is, say, 256x240 and the output texture is, say, 1440x1080, the CRT shader would be doing a “for each pixel in the output texture, set its colour to x”, where x is a function of pixel(s) in the input texture plus the CRT effects. Would this be correct?
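Something like this toy sketch is what I have in mind (plain C, grayscale, all names made up; just to illustrate the gather pattern, not anything resembling a real CRT shader):

```c
#include <math.h>

#define PI 3.14159265358979f

static void crt_pass(const float *src, int src_w, int src_h,   /* e.g. 256x240   */
                     float *dst, int dst_w, int dst_h)         /* e.g. 1440x1080 */
{
    for (int oy = 0; oy < dst_h; oy++) {
        for (int ox = 0; ox < dst_w; ox++) {
            /* Map this output pixel back into source-texture space (0..1). */
            float u = (ox + 0.5f) / dst_w;
            float v = (oy + 0.5f) / dst_h;

            /* Fetch the underlying game pixel (nearest fetch for brevity). */
            int sx = (int)(u * src_w);
            int sy = (int)(v * src_h);
            float colour = src[sy * src_w + sx];

            /* Scanline factor: a function of where we are WITHIN the source
             * row (v * src_h), which is why the dark lines land on the game's
             * pixel rows no matter what the output resolution is. */
            float row_pos  = v * src_h - floorf(v * src_h);   /* 0..1 in row */
            float scanline = 0.6f + 0.4f * sinf(PI * row_pos);

            dst[oy * dst_w + ox] = colour * scanline;
        }
    }
}
```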


Yes, you got it. Hit the nail on the head :)


Now that you understand the simplest way the RetroArch shader chain operates, let’s think a bit further about the power of this system:

A pass’s output isn’t only a function of the previous framebuffer; it can also use the one before that. You can access many framebuffers, each one being a pass in the preset. And not only that: you can even access the whole framebuffer chain from previous frames (previous in time, not only in space).
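As a toy data-structure sketch of the idea (plain C, names mine, not RetroArch’s internals; the real shader spec exposes these buffers to each pass as textures):

```c
#define MAX_PASSES   8
#define HISTORY_LEN  4

typedef struct {
    float *pixels;
    int    width, height;
} framebuffer;

typedef struct {
    /* The core's raw output for the current frame. */
    framebuffer original;

    /* Output of every pass in the preset for the current frame; pass N can
     * read the outputs of passes 0..N-1, and the last pass's output is what
     * lands in the viewport. */
    framebuffer pass_output[MAX_PASSES];

    /* The core's output from the last few frames, so a pass can also blend
     * across time (motion blur, temporal effects, and so on). */
    framebuffer history[HISTORY_LEN];
} shader_chain_state;
```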
