Need advice on post-shader gamma calibration

Hi there, I would like to gamma-correct my presets, easy question :slight_smile:

Say I’m almost done with my shiny new preset, but I want its colors to match the intended levels. To check, I added a function that splits the screen: the right side shows the unprocessed source image, and the left side shows the image after the shader.

My target workflow would be to tune final shader parameters so that left matches right, provided that right is correct!

But what I see on the right, the unprocessed image, seems washed out, and I don’t think the game is supposed to look that way (there it has been corrected with an arbitrary gamma of 1.1, which resembles more or less what grade does by default).

By contrast, and following my taste, this seems more correct to me:

So the final question is: what should I do to the right side so that it makes a good target for the left one?

Linearize + delinearize? Gamma down? Linearize + delinearize + gamma down? And by how much?
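For reference, here is how I’d sketch those three candidate operations in Python, assuming simple power-law gamma (not the piecewise sRGB curve) and normalized 0..1 values:

```python
def linearize(c, gamma=2.2):
    """Decode a gamma-encoded value to linear light."""
    return c ** gamma

def delinearize(c, gamma=2.2):
    """Re-encode linear light back to gamma space."""
    return c ** (1.0 / gamma)

def gamma_down(c, amount=1.1):
    """Darken by pushing the curve down (exponent > 1)."""
    return c ** amount

c = 0.5
# Linearize + delinearize with the SAME exponent is a no-op:
roundtrip = delinearize(linearize(c))
# Decode at a CRT-like gamma, re-encode at the display's gamma: darkens,
# equivalent to gamma_down with exponent 2.45 / 2.2:
corrected = delinearize(linearize(c, 2.45), 2.2)
```

So linearize + delinearize only does something if the two exponents differ, in which case it collapses to a single gamma-down (or gamma-up) anyway.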

Maybe it looks fine and it is just me?



What do the likes mean?

“Good question, I’d like to know” or maybe “They both look fine, your choice”… or? :slight_smile:


Preserving saturation is a good thing. Multiplication in any form could be very nice. This way, you can still lower output gamma to increase saturation and contrast a bit.


If you’re using raw emu output to a LCD with gamma of 2.2, it’s going to look washed out vs a CRT with gamma of 2.4-2.5. I think it makes sense to go ahead and do that much correction to the raw image and then try to calibrate the post-shader output (with the mask and scanlines, etc.) to that otherwise “pure” image.


“that much correction” :thinking:

So what I’m supposed to do is target ~2.45 gamma on a display that is already giving me 2.2.

By trial and error (because my math…), the naked picture would need a gamma push of ~1.11?

This would be consistent with grade’s defaults (first shot in the opening post).
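The trial-and-error figure lands right where the math does, if you assume pure power-law gamma: the push is just the ratio of the two exponents.

```python
crt_gamma = 2.45      # target CRT-like response
display_gamma = 2.2   # what the LCD already applies
push = crt_gamma / display_gamma   # ~1.11, matching the trial-and-error value

# Applying the push and then letting the display apply its own gamma
# reproduces the target curve end to end:
c = 0.5
displayed = (c ** push) ** display_gamma   # == c ** crt_gamma
```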



Yeah, I like a richer image, as well.

It’s not based on anything scientific, but the old SNES emu Super Sleuth used a “gamma ramp” function that looked really nice in the way it darkened some stuff but left others bright. I ported it to slang here:


Nice, i’ll give it a shot, thanks!


I’m not sure if that’s because the unprocessed image, viewed progressively on an LCD display, might be vastly different from what the devs envisioned or encountered when developing and testing on actual hardware.

Some arcade games even had tinted screens in front of the CRT monitors.

We all know how Game Boy Advance games had boosted brightness/gamma levels to account for the low quality of the LCD screen in the final device.

We’re also aware that for several systems, developers deliberately took the various quirks and limitations of NTSC signalling into consideration and put them to great creative use.

If you take a preset that’s optimized for SNES and try to use it on CPS1 Arcade Games, you might find that the CPS1 games look a bit unnaturally bright or desaturated.

Same SNES preset on NES games might look oversaturated.

Some CPS1 games have a grey rectangle around the FBI warning; you can let that be your guide to the correct gamma setting for those games: adjust your gamma until the grey box blends into the background.
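If you can actually sample the two grey values (the numbers below are hypothetical, just to show the idea), the matching exponent can even be solved for directly instead of dialed in by eye:

```python
import math

# Hypothetical sampled values, normalized 0..1: the grey box reads
# brighter than the background it is supposed to blend into.
box = 0.55
background = 0.50

# Solve box ** g == background for g; that g is the gamma adjustment
# that makes the box disappear into the background.
g = math.log(background) / math.log(box)
```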

It also helps to have in-person experience to get a better idea of where things should fall, though.


The problem is that scanlines and masks introduce some highlight compression, which in the case of the zfast_crt shader looked kind of bad/dimmed. It’s not gamma in the strict sense but reverse gamma, so the weight is on the highlights. I don’t know about crt-guest or other shaders, masks, etc., but they should draw a similar curve.

Example exaggerated.

To compensate I do the opposite, thus recovering highlights. (exaggerated)

Usual CRT gamma, on the other hand, looks similar to this: dark-weighted, or with a “knee” so to speak. (not exaggerated)
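A numeric sketch of the idea (toy curves of my own, not zfast_crt’s actual math): a mask dims everything multiplicatively, which pulls the highlights down, and a “reverse gamma” (exponent below 1) pushes values back up to recover most of that headroom.

```python
mask_strength = 0.7   # hypothetical mask dimming factor

def masked(c):
    """Multiplicative mask dimming: scales everything down."""
    return c * mask_strength

def recover(c, g=0.8):
    """Reverse gamma: exponent < 1 lifts the curve back up."""
    return c ** g

bright = masked(1.0)          # full white dimmed to 0.7 by the mask
recovered = recover(bright)   # lifted back toward 1.0
```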

So what gamma to use? There are two gammas, guest and host. CRT gamma is the guest gamma, so we try to replicate the developer’s CRT gamma; host gamma is your (O)LED’s gamma.

Let’s say CRT gamma was fixed at 2.4, which is a commonly agreed value. Smart TV or monitor gammas, uncalibrated by default, are a disaster, as they don’t draw a nice power curve, but if anything they approximate 2.2. Calibrated, that value would be OK for interior lighting; it’s called “office gamma”, but I think it would also work for well-lit rooms at night. Typically you want to calibrate your display to 2.4 if you’re under dim light, with just a few lamps on. If that’s the case, the system gamma from CRT guest to (O)LED host is 1.0.
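In other words (assuming pure power laws), the correction exponent you need is just guest over host, and it drops out to 1.0 when the two match:

```python
guest_crt_gamma = 2.4     # gamma of the developer's CRT
host_display_gamma = 2.4  # your display, calibrated for dim lighting

# Correction exponent so the host reproduces the guest's response:
system = guest_crt_gamma / host_display_gamma   # 1.0: no correction needed

# On a 2.2 "office" display the same logic leaves a residual ~1.09 push:
office = guest_crt_gamma / 2.2
```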

So a third question arises… You set your display to 2.4 gamma in dim light or 2.2 for office light, but what surround environment was the game developed in? Dim? Office light? If it was brighter than dim, the developer would tend to compensate the “dim-adjusted” CRT gamma of 2.4 with higher game color values due to the Bartleson-Breneman effect. In that case you would also need to account for that and use a higher guest CRT gamma; grade, for example, uses 2.50, but a study of developers’ design surround environments should be addressed first.


For what it’s worth, I set my Gamma Correct (Gamma_C) in CRT-Guest-Advanced (and the Mega Bezel Reflection Shader) to 0.90 from the default of 1.0 for CPS1 games, and that looks “right” to me.
