Pixellate.cg in linear light?

I was wondering if it might be possible to update the pixellate shader to blend in linear light, as it seems to be blending in gamma light.

I took a screenshot from the Super Mario World title screen as a quick test. Pixellate.cg blending between red and green results in an ugly dark blend between the two colors:

That’s a close match to what I see in Photoshop if I apply a blur between the two colors in gamma light: (a 1px box blur ended up with 2px of blending, ignore that)

If I have Photoshop apply the same blur in linear light, the result is more accurate to how the two colors should actually blend, and generally more pleasing to look at:

Updating pixellate.cg to blend in linear light could really improve an already great shader. Unfortunately, how to do this is beyond my technical abilities. I don’t know how to write a shader, and layering several shaders didn’t seem to work. I believe what you would have to do for this would be to convert from 2.2 to 1.0 gamma, blend, and then convert back to 2.2 gamma, but I’m not certain. If it’s possible, it would probably be useful to have gamma options in the shader. (input and output)
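To illustrate what I mean, here’s a rough sketch in plain C of the difference, assuming a simple 2.2 power curve for decode and encode (the real shader would of course do this per pixel on the GPU):

    #include <math.h>
    #include <stdio.h>

    /* 50/50 blend between pure red (1,0,0) and pure green (0,1,0),
     * once in gamma space and once in linear light, assuming a plain
     * 2.2 power curve. Both non-zero channels behave identically,
     * so one channel is enough to show the difference. */
    int main(void)
    {
        const double g = 2.2;

        /* Gamma-space blend: average the encoded values directly. */
        double gamma_blend = 0.5 * 1.0 + 0.5 * 0.0;

        /* Linear-light blend: decode, average, re-encode. */
        double linear_blend = pow(0.5 * pow(1.0, g) + 0.5 * pow(0.0, g), 1.0 / g);

        printf("gamma-space blend : %.3f (~%.0f/255)\n", gamma_blend,  gamma_blend * 255.0);
        printf("linear-light blend: %.3f (~%.0f/255)\n", linear_blend, linear_blend * 255.0);
        /* ~0.730 (186/255) per channel in linear light vs 0.5 (128/255)
         * in gamma space: the gamma-space blend is the dark band I'm seeing. */
        return 0;
    }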

This is very easy to do.

I made a special pixellate with those options. You can get it in the attachment.

I created three params:

  1. Gamma ON/OFF - this is a switch to use or not use gamma. It’s OFF by default, which means pixellate will work like the old one.
  2. Input Gamma - default value is 2.2 (in CRT shaders the value used is 2.4 or 2.5).
  3. Output Gamma - default value is 2.2 (this shouldn’t be changed).
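Roughly, this is what the gamma path does when the switch is ON. A simplified sketch of the math in plain C; the actual Cg code and parameter names in the attached shader may differ:

    #include <math.h>
    #include <stdio.h>

    /* Simplified sketch of the gamma-aware path: decode with the input
     * gamma, do the pixellate blending in linear light, re-encode with
     * the output gamma. The names here are illustrative, not the ones
     * used in the attached shader. */
    static double blend_with_gamma(double a, double b, double weight,
                                   double input_gamma, double output_gamma,
                                   int gamma_on)
    {
        if (!gamma_on)
            return a * (1.0 - weight) + b * weight;      /* old behaviour */

        double la  = pow(a, input_gamma);                /* to linear light */
        double lb  = pow(b, input_gamma);
        double lin = la * (1.0 - weight) + lb * weight;  /* blend linearly */
        return pow(lin, 1.0 / output_gamma);             /* back to gamma */
    }

    int main(void)
    {
        /* 50/50 red/green blend, gamma OFF vs ON with 2.2 in/out. */
        printf("OFF: %.3f  ON: %.3f\n",
               blend_with_gamma(1.0, 0.0, 0.5, 2.2, 2.2, 0),
               blend_with_gamma(1.0, 0.0, 0.5, 2.2, 2.2, 1));
        return 0;
    }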

I couldn’t see any improvements. If you see any, post some screenshots here.

That was quick, thank you! It’s one of those subtle but worthwhile changes.

Gamma Light:

Linear Light:

Red in particular blends much better against the green now.

The gamma parameters in the shader don’t behave as I would expect though. If my input is 2.50 gamma, converted to linear (1.0 gamma), blended, and then output as 2.20 gamma, I would expect the image to get lighter. (lower output gamma = lighter image) If my input is 2.20 gamma, converted to linear, blended, and output at 2.40 gamma, then I would expect the image to get darker. I’m currently seeing the opposite behavior.

Try this, if that’s what you want:

(drop it in retroarch/shaders/cgp/)

edit: it’s bad, same result as stock.cgp which makes it useless :stuck_out_tongue: I removed it.

I was trying to use the linearize passes from shaders\srgb-helpers\…

[QUOTE=larch1991;37128] The gamma parameters in the shader don’t behave as I would expect though. If my input is 2.50 gamma, converted to linear (1.0 gamma), blended, and then output as 2.20 gamma, I would expect the image to get lighter. (lower output gamma = lighter image) If my input is 2.20 gamma, converted to linear, blended, and output at 2.40 gamma, then I would expect the image to get darker. I’m currently seeing the opposite behavior.[/QUOTE]

No, they’re right. It’s your perspective that’s different from mine.

InputGamma is the gamma the game colors are assumed to have been encoded with. So, if they were intended to be played on CRT displays, the color values were encoded using a gamma between 2.35 and 2.5.

What the shader does is bring the colors from 2.5 gamma to 1.0 (linear light). Then it processes the pixels and, at the end, converts them back to the desired display gamma. As the majority of the displays RetroArch will be used on are flat panels, most of them LCDs, we use gamma 2.2 as the output.

[QUOTE=Hyllian;37130]No, they’re right. It’s your perspective that’s different from mine.

InputGamma is the gamma the game colors are assumed to have been encoded with. So, if they were intended to be played on CRT displays, the color values were encoded using a gamma between 2.35 and 2.5.

What the shader does is bring the colors from 2.5 gamma to 1.0 (linear light). Then it processes the pixels and, at the end, converts them back to the desired display gamma. As the majority of the displays RetroArch will be used on are flat panels, most of them LCDs, we use gamma 2.2 as the output.[/QUOTE]

I think we actually mean the same thing, but you see input and output gamma as “display gammas” whereas I see input gamma as “encoding gamma” and output as “target gamma”.

So with the example of Input Gamma = 2.50, Output Gamma = 2.20: you see that as trying to recreate a 2.50 gamma display on a 2.20 gamma display. (image gets darker)

Whereas I’m seeing that as:
Encoding gamma = 2.50 - “undo” this to get an accurate conversion to 1.0 gamma.
Target gamma = 2.20 - convert from 1.0 gamma to 2.20 gamma.
Since the encoding gamma was higher than the target gamma, the image should get lighter.

If I know that my display is calibrated to 2.20 gamma, and I want the output to be 2.50 gamma, I might set the input as 2.20 and the output as 2.50 (though that’s not technically correct, since encoding gamma != display gamma). Setting the input to 2.50 means that the conversion to linear light would not be correct, because that content certainly will not have been encoded to 2.50, even if it was intended to be viewed at 2.50.

Now I’ve mainly worked with HD, so I’d have to look over the specs to see if SD differs, but it’s actually more like encode gamma = 1.96 and display gamma = 2.35 (system gamma = 1.2). Other (incorrect, in my opinion) sources may cite encode gamma as 2.20 and system gamma as 1.125, for a display gamma of 2.475. But it was definitely never encoded at 2.50 gamma. That would result in an incorrect transform to linear light.

I hope that makes sense. Gamma can be a very confusing topic and the specs have been fairly lax until recently. As it is, the shader still makes a good improvement to the quality of pixellate, but I hope you can change this.

It gets darker, because the inverse conversion is done by color ^ (1.0 / 2.2). And (1.0/2.2) =~ 0.45.

Well, that’s not exactly the reason why you see a darker picture. The reason is that the colors were encoded to be seen on a 2.5 gamma display. When you see them on a 2.2 gamma display, the colors will certainly appear lighter than they were supposed to look. So when you convert the colors from 2.5 to 2.2 gamma, the 2.2 display will show darker colors compared to the original 2.5 colors shown on it.
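A quick worked example with a mid-grey, just to put numbers on it (plain C, pure power curves assumed):

    #include <math.h>
    #include <stdio.h>

    /* A mid-grey value pushed through the 2.5-in / 2.2-out path. */
    int main(void)
    {
        double c = 0.5;
        double out = pow(pow(c, 2.5), 1.0 / 2.2);  /* decode at 2.5, re-encode at 2.2 */
        printf("0.5 -> %.3f\n", out);              /* ~0.455, i.e. slightly darker */
        return 0;
    }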

This is why gamma is so confusing.

  1. CRTs were never 2.50 gamma. They produce 2.35 gamma when accurately calibrated and measured.
  2. Encoding gamma (OETF) is not display gamma (EOTF). There was actually no specification for display gamma until 2011. It was assumed that system gamma would be around 1.2 in a dark room.
  3. Content was never encoded at 2.35 gamma - or 2.50.

The BT.601 transfer function is closest to a 1.96 (0.51) gamma curve. Here is a graph comparing the three. So for content encoded at 1.96 gamma, a system gamma of 1.2 means that display gamma should be 2.35.
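For reference, here’s a small comparison in plain C of the Rec.601/709-style OETF (the 0.45 power with a linear segment near black) against a plain 1/1.96 power curve; the constants are the usual 1.099 / 0.099 / 4.5 / 0.018 ones, which I believe are the applicable ones here:

    #include <math.h>
    #include <stdio.h>

    /* Compare the Rec.601/709-style OETF (0.45 power with a linear segment
     * near black) against a plain 1/1.96 power curve at a few points. */
    static double rec_oetf(double L)
    {
        return (L < 0.018) ? 4.5 * L : 1.099 * pow(L, 0.45) - 0.099;
    }

    int main(void)
    {
        double samples[] = { 0.01, 0.05, 0.1, 0.25, 0.5, 0.75, 1.0 };
        int n = (int)(sizeof samples / sizeof samples[0]);
        for (int i = 0; i < n; i++) {
            double L = samples[i];
            printf("L=%.2f  OETF=%.3f  L^(1/1.96)=%.3f\n",
                   L, rec_oetf(L), pow(L, 1.0 / 1.96));
        }
        /* The two track closely over most of the range and diverge mainly
         * near black, which is why ~1.96 is a reasonable pure-power stand-in. */
        return 0;
    }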

To get from our input colors to linear gamma for SD consoles, we need to use 1.96 gamma (0.51), not 2.35 or 2.50 gamma. For handheld systems like the GBA, which used an LCD screen, it’s most likely encoded at 2.2 gamma (0.45).

Since this content was then intended to be viewed with a system gamma of 1.2 - at least for the SD consoles - our output gamma should be higher than our input. If the input gamma is 1.96, an output of 1.96 should keep the overall image brightness the same. A higher output gamma, such as 2.35, should then produce a darker image - because the input was 1.96.

The processing is: <encoding gamma> to linear gamma, blending, linear gamma to <target gamma>

If you specify your target gamma (2.35) as the input, then the source is not correctly linearized. I understand why you set it up like this, because that’s probably closer to the conventions used in emulation, but the way it’s set up now is using input gamma as both encoding and target gamma, and output gamma as display gamma. Input gamma should specify the encoding gamma, and output gamma should specify the target gamma.

For “typical” gamma controls, where you want to specify <display gamma> (2.20) and <target gamma> (2.50), it would probably be best to keep input and output the same for pixellate (1.96 for both) and, I guess, add another shader for that. Or maybe you would just specify 2.20 as the input and 2.50 as the output. It’s probably not far off enough that it would matter - you’d still have most of the benefit of processing in linear light.

I know this all sounds really pedantic, and is making a simple request complicated, but it does matter. If your encoding gamma is not correct for linear light processing, it means the blending is not performed correctly.

Here’s a comparison between 1.96 in/out and 5.00 in/out at 100% size from Symphony of the Night:

As you can hopefully see, the overall image brightness remains the same, but setting the encoding gamma too high means that bright pixels appear brighter or larger in size than dark pixels because it’s not being blended correctly - which is why you can’t specify your target gamma (2.50) as your input gamma. Input gamma must be the encoding gamma.
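To put numbers on it: a 50/50 blend of a white pixel and a black pixel lands much closer to white when the decode exponent is too high (plain C sketch, pure power curves assumed):

    #include <math.h>
    #include <stdio.h>

    /* 50/50 blend of white (1.0) and black (0.0), decoded and re-encoded
     * with the same exponent, for two different exponents. */
    static double blend_half(double g)
    {
        return pow(0.5 * pow(1.0, g) + 0.5 * pow(0.0, g), 1.0 / g);
    }

    int main(void)
    {
        printf("g=1.96: %.3f\n", blend_half(1.96));  /* ~0.702 */
        printf("g=5.00: %.3f\n", blend_half(5.00));  /* ~0.871 */
        /* With g=5.00 the midpoint between white and black lands much closer
         * to white, so bright pixels dominate the blend and appear to grow. */
        return 0;
    }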

I guess it might be possible to achieve a correct result by setting 1.96 as the input gamma and perhaps something like 1.73 (rough estimate) as the output gamma, to obtain an approximate 2.35 gamma output from the shader, but that seems really confusing to me. As I said though, gamma is confusing - especially when you’re dealing with linear light.

Every time I read about gamma, the explanation is different. Well, the params inside the shader are configurable; if you feel comfortable using gamma 1.96, change them.

Isn’t it supposed to blend based on how far into the physical pixel the original vectorized upscaled pixel is? In that case, there should only be 1 correct answer for the resulting physical pixel provided by the shader. The blend should be closer to red if there’s a greater proportion of red within that pixel space than green, right?

I agree that it’s probably simpler just to leave the gamma correction out of the pixellate shader and just linearize/unlinearize it assuming the user will handle gamma correction elsewhere.

The typical way to quick-and-dirty linearize is: float3 linear = pow(tex2D(texture, texCoord).rgb, 2.2);

I tried doing a simple shader that linearizes (using either 2.2 or 1.96), adds to itself and divides by 2, and then re-curves, which should put out exactly the same colors as an unadjusted image if it were properly linearized. In this case, 2.2 is identical to the unadjusted image while 1.96 is noticeably darker:

Yes, and that’s what is done.

I, too, have a feeling that changing the gamma may break the right pixel proportion.

[QUOTE=Lex;37146]Isn’t it supposed to blend based on how far into the physical pixel the original vectorized upscaled pixel is? In that case, there should only be 1 correct answer for the resulting physical pixel provided by the shader. The blend should be closer to red if there’s a greater proportion of red within that pixel space than green, right?[/QUOTE]

If only! The problem is that you can’t do a perceptually even blending in gamma light. You need to convert it to linear light to do the blend, and then back to gamma light for it to look correct. If you perform it in gamma light, it just picks the value at 50% between the two based on the numbers, which is not how we perceive color.

[QUOTE=hunterk;37147]I agree that it’s probably simpler just to leave the gamma correction out of the pixellate shader and just linearize/unlinearize it assuming the user will handle gamma correction elsewhere.

The typical way to quick-and-dirty linearize is: float3 linear = pow(tex2D(texture, texCoord).rgb, 2.2);

I tried doing a simple shader that linearizes (using either 2.2 or 1.96), adds to itself and divides by 2, and then re-curves, which should put out exactly the same colors as an unadjusted image if it were properly linearized. In this case, 2.2 is identical to the unadjusted image while 1.96 is noticeably darker: http://i.imgur.com/HcSbFPX.gif[/QUOTE]

Hmm, I’m not sure what’s going on then. Even if the conversion to linear light is not correct, if you’re transforming it back to gamma light using the same value as the input, I’m not sure why the image would change like that. I guess it’s possible that the emulator’s output is 2.2 gamma?
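As far as I can tell, in exact arithmetic that test should be an identity for any exponent, assuming everything stays in float and nothing gets quantized in between. A quick sanity check in plain C:

    #include <math.h>
    #include <stdio.h>

    /* ((x^g + x^g) / 2) ^ (1/g) should give x back for any g, so the
     * exponent alone shouldn't change the image; any difference would
     * have to come from precision or an intermediate framebuffer. */
    int main(void)
    {
        double xs[] = { 0.1, 0.25, 0.5, 0.75, 0.9 };
        double gs[] = { 1.96, 2.2 };
        for (int j = 0; j < 2; j++)
            for (int i = 0; i < 5; i++) {
                double x = xs[i], g = gs[j];
                double y = pow((pow(x, g) + pow(x, g)) / 2.0, 1.0 / g);
                printf("g=%.2f x=%.2f -> %.6f\n", g, x, y);
            }
        return 0;
    }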

So perhaps only linearizing using 2.2 gamma in/out without any options would work best for most people. Keeps it nice and simple too.

[QUOTE=Hyllian;37148]Yes, and that’s what is done. I, too, have a feeling that changing the gamma may break the right pixel proportion.[/QUOTE]

Well, it’s supposed to change, because now it should be based on perceptual values instead of numerical ones. That’s why it’s critical to use the correct value as the input gamma, though.

[QUOTE=larch1991;37141]This is why gamma is so confusing.

  1. CRTs were never 2.50 gamma. They produce 2.35 gamma when accurately calibrated and measured.
  2. Encoding gamma (OETF) is not display gamma (EOTF). There was actually no specification for display gamma until 2011. It was assumed that system gamma would be around 1.2 in a dark room.
  3. Content was never encoded at 2.35 gamma - or 2.50. [/QUOTE]

For most of the applications here, there was no OETF. Pixel art was designed on computers and what mattered to the artists was how it looked on a television; there was never an image captured by a sensor and no “encoding” as such. Therefore only two functions matter: the original historical EOTF, whether it was for a CRT television or a handheld LCD (called input gamma in this shader), and the EOTF of the user’s display (usually presumed to be close to the sRGB curve; called output gamma in this case). The standard procedure to reproduce the original look is the following (sketched in code below):

  1. Apply the historical EOTF to obtain the original displayed image in linear light.
  2. Perform whatever linear-light transformations you desire.
  3. Apply the inverse of the user’s EOTF.
  4. The user’s display applies its EOTF. This undoes the previous step, obtaining the (transformed) image as displayed on the historical display.
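A sketch of steps 1-3 in plain C, assuming a pure power curve for the historical EOTF and the piecewise sRGB curve for the user’s display; step 4 happens in the display itself:

    #include <math.h>
    #include <stdio.h>

    /* Inverse of the sRGB EOTF: linear light -> sRGB-encoded value. */
    static double srgb_encode(double lin)
    {
        return (lin <= 0.0031308) ? 12.92 * lin
                                  : 1.055 * pow(lin, 1.0 / 2.4) - 0.055;
    }

    /* Steps 1-3 above, for one channel:
     * 1. apply the historical EOTF (a pure power here, e.g. 2.35 for a CRT TV),
     * 2. do the linear-light processing (just a weighted blend here),
     * 3. apply the inverse of the user's EOTF (sRGB here).
     * Step 4 happens in the user's display itself. */
    static double reproduce(double a, double b, double weight, double hist_gamma)
    {
        double la  = pow(a, hist_gamma);                 /* step 1 */
        double lb  = pow(b, hist_gamma);
        double lin = la * (1.0 - weight) + lb * weight;  /* step 2 */
        return srgb_encode(lin);                         /* step 3 */
    }

    int main(void)
    {
        /* Blend a bright and a dark value as they would have appeared on a
         * 2.35-gamma CRT, re-encoded for an sRGB display. */
        printf("%.3f\n", reproduce(0.9, 0.1, 0.5, 2.35));
        return 0;
    }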

Yes, it would not strictly be “OETF” since it’s not being filmed, but generally that refers to the “encoding gamma”. The issue is that you would typically not be viewing with a linear system gamma; linear meaning that the display gamma is equal to the encoding gamma. Display gamma would typically be higher than encoding gamma (system gamma > 1.0), and it is almost a certainty that no display was ever calibrated to the sRGB response curve. (assume 2.2 gamma for an “sRGB” display) Even if the display was running at 2.35 gamma (CRT), the encoding gamma would not be 2.35, because your system gamma would typically be ~1.2.

It’s still not entirely clear to me why Hyllian’s test gave the results that it did. Going from 1.96 to 2.35 gamma looks more “correct” to me for the output than going from 2.20 to 2.50 gamma if we’re trying to recreate a CRT response curve. That being said, I think I’ve now got the right values to use with this shader to have a 1.96 encode gamma, and output 2.35 gamma on a 2.22 gamma display, so my own needs are met.

But if you want to have a shader that other people will understand, I think that the answer to all of this is to add a third variable. Then we could have Encoding Gamma, Display Gamma, and Target Gamma - which would eliminate any confusion. For encoding gamma, you could specify 1.96, 2.20, or whatever you think is “accurate”. Personally I still think that the former should be correct, but perhaps it could be the latter due to this being emulation.

With the display gamma variable, you tell the shader what your display’s current gamma is. This should default to 2.2 gamma, since most displays should be calibrated to that. Then you can specify a target gamma - either 2.2 if you want an unmodified output, or something else if you want to change the gamma. (e.g. 2.5)

Have everything default to 2.2 and there won’t be any surprises for people. Adjusting the target gamma from that will work as people expect. This would leave the encoding/display gammas available for people that want that control, without having to calculate what value should result in the correct output for their display.
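For what it’s worth, here is one way the three parameters could combine, in a plain C sketch. This is only my interpretation of “target gamma” (the apparent display gamma the original encoded signal should be shown with), not anything taken from the existing shader:

    #include <math.h>
    #include <stdio.h>

    /* One possible way the three proposed parameters could combine
     * (my interpretation only): decode with the encoding gamma, process in
     * linear light, then re-encode so that a display with display_gamma
     * ends up showing the signal as if it were on a target_gamma display. */
    static double encode_for_display(double lin, double encoding_gamma,
                                     double display_gamma, double target_gamma)
    {
        return pow(lin, target_gamma / (encoding_gamma * display_gamma));
    }

    int main(void)
    {
        /* With encoding 1.96, display 2.22, target 2.35, the display's light
         * output ends up following a 2.35 curve for the original signal. */
        double s   = 0.5;
        double lin = pow(s, 1.96);
        double out = encode_for_display(lin, 1.96, 2.22, 2.35);
        printf("shader output: %.4f, displayed light: %.4f (0.5^2.35 = %.4f)\n",
               out, pow(out, 2.22), pow(0.5, 2.35));
        return 0;
    }

With everything set to 2.2 this reduces to a plain 1/2.2 re-encode, so the defaults would behave exactly like a normal linearize/unlinearize pass.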