This is why gamma is so confusing.
- CRTs never had a gamma of 2.50. When accurately calibrated and measured, they produce a gamma of about 2.35.
- Encoding gamma (OETF) is not display gamma (EOTF). There was actually no specification for display gamma until BT.1886 in 2011. It was assumed that system gamma would be around 1.2 in a dark room.
- Content was never encoded at 2.35 gamma - or 2.50.
The BT.601 transfer function is closest to a 1.96 (0.51) gamma curve. Here is a graph comparing the three.
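Separately from that graph, here's a quick numeric check - a Python sketch of my own, assuming the standard piecewise Rec.601/709 OETF and pure power-law curves; the sample points are arbitrary:

```python
# Compare the piecewise BT.601/BT.709 OETF with pure power-law encodings.

def bt601_oetf(linear):
    """Rec.601/709 OETF: linear segment near black, power segment above."""
    if linear < 0.018:
        return 4.5 * linear
    return 1.099 * linear ** 0.45 - 0.099

def power_encode(linear, gamma):
    """Pure power-law encoding: V = L^(1/gamma)."""
    return linear ** (1.0 / gamma)

for L in (0.01, 0.05, 0.1, 0.2, 0.5, 0.8):
    print(f"L={L:.2f}  BT.601={bt601_oetf(L):.3f}  "
          f"1.96={power_encode(L, 1.96):.3f}  "
          f"2.35={power_encode(L, 2.35):.3f}  "
          f"2.50={power_encode(L, 2.50):.3f}")
```

Outside the linear segment near black, the 1.96 curve tracks BT.601 far more closely than 2.35 or 2.50 do.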
So for content encoded at 1.96 gamma, a system gamma of 1.2 means that display gamma should be 2.35 (1.96 × 1.2 ≈ 2.35).
To get from our input colors to linear gamma for SD consoles, we need to use 1.96 gamma (0.51) not 2.35 or 2.50 gamma.
For handheld systems like the GBA, which used an LCD screen, it's most likely encoded at 2.2 gamma (0.45).
Since this content was then intended to be viewed with a system gamma of 1.2 - at least for the SD consoles - our output gamma should be higher than our input.
If the input gamma is 1.96, an output of 1.96 should keep the overall image brightness the same.
A higher output gamma, such as 2.35, should then produce a darker image, because the input was 1.96.
The processing is:
<encoding gamma> to linear gamma, blending, linear gamma to <target gamma>
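Here's a minimal sketch of that order of operations, assuming simple power-law transfer functions and a plain 50/50 blend (the function names are mine, not the shader's):

```python
def to_linear(encoded, encoding_gamma):
    """Undo the encoding gamma: encoded signal -> linear light."""
    return encoded ** encoding_gamma

def from_linear(linear, target_gamma):
    """Re-encode linear light for the target gamma."""
    return linear ** (1.0 / target_gamma)

def blend(a, b, encoding_gamma=1.96, target_gamma=1.96):
    """Blend two encoded values in linear light, then re-encode."""
    linear = (to_linear(a, encoding_gamma) + to_linear(b, encoding_gamma)) / 2.0
    return from_linear(linear, target_gamma)

# Same gamma in and out leaves overall brightness unchanged:
print(blend(0.2, 0.8))  # ~0.58, blended in linear light
```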
If you specify your target gamma (2.35) as the input, then the source is not correctly linearized.
I understand why you set it up like this - it's probably closer to the conventions used in emulation - but the way it's set up now, input gamma is being used as both the encoding and target gamma, and output gamma as the display gamma.
Input gamma should specify the encoding gamma, and output gamma should specify the target gamma.
For “typical” gamma controls, where you want to specify <display gamma> (2.20) and <target gamma> (2.50), it would probably be best to keep input and output the same for pixellate (1.96 for both) and, I guess, add another shader for that.
Or maybe you would just specify 2.20 as the input and 2.50 as the output. It’s probably not far off enough that it would matter - you’d still have most of the benefit of processing in linear light.
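To put a rough number on "not far off", here's a sketch (pure power-law model, illustrative pixel values) comparing a blend linearized at 2.20 against one linearized at 1.96, and against no linearization at all:

```python
def blend_in_linear(a, b, gamma):
    """Average two encoded values in linear light, then re-encode."""
    linear = (a ** gamma + b ** gamma) / 2.0
    return linear ** (1.0 / gamma)

dark, bright = 0.25, 0.75
print("1.96 in/out:", blend_in_linear(dark, bright, 1.96))  # ~0.557
print("2.20 in/out:", blend_in_linear(dark, bright, 2.20))  # ~0.569
print("no linearization:", (dark + bright) / 2.0)           # 0.500
```

The 2.20 result lands within about 0.01 of the 1.96 one, while skipping linearization entirely is off by several times that.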
I know this all sounds really pedantic, and is making a simple request complicated, but it does matter.
If your encoding gamma is not correct for linear light processing, it means the blending is not performed correctly.
Here’s a comparison between 1.96 in/out and 5.00 in/out at 100% size from Symphony of the Night:

As you can hopefully see, the overall image brightness remains the same, but setting the encoding gamma too high means that bright pixels appear brighter, or larger, than dark pixels because the blending is performed incorrectly - which is why you can't specify your target gamma (2.50) as your input gamma.
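The same effect in numbers, reusing the helper from the sketch above (pure power curves, a single 50/50 blend, illustrative pixel values):

```python
def blend_in_linear(a, b, gamma):
    linear = (a ** gamma + b ** gamma) / 2.0
    return linear ** (1.0 / gamma)

dark, bright = 0.25, 0.75
print("1.96 in/out:", blend_in_linear(dark, bright, 1.96))  # ~0.557
print("5.00 in/out:", blend_in_linear(dark, bright, 5.00))  # ~0.653
```

With the encoding gamma set far too high, the blend skews toward the bright pixel - which is exactly why bright pixels look brighter and larger in the screenshots.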
Input gamma must be the encoding gamma.
I guess it might be possible to achieve a correct result by setting 1.96 as the input gamma and perhaps something like 1.73 (rough estimate) as the output gamma, to approximate a 2.35 gamma output from the shader, but that seems really confusing to me.
As I said though, gamma is confusing - especially when you’re dealing with linear light.