New CRT shader from Guest + CRT Guest Advanced updates

I was thinking maybe one last thing about gamma could be considered.

If I look at the wiki page for sRGB (https://en.wikipedia.org/wiki/SRGB) I see that the transfer function is actually only power gamma (2.4) above a certain luminance threshold, below which it is linear (you are already aware of these functions, but just for discussion purposes).

The actual sRGB forward transformation (linear to sRGB) is:

if C_linear <= 0.0031308: C_srgb = 12.92 * C_linear
else:                     C_srgb = 1.055 * C_linear^(1/2.4) - 0.055

and the reverse transformation (sRGB to linear) is:

if C_srgb <= 0.04045: C_linear = C_srgb / 12.92
else:                 C_linear = ((C_srgb + 0.055) / 1.055)^2.4

So I was thinking: since users' LCD displays these days use sRGB gamma, it may be more correct if the shader's encoding function (gamma_out) doesn't use the current pure power gamma, but uses the above sRGB transfer function?

Since these sRGB transfer functions seem trivial to implement, it may be an interesting exercise to add a switch for both "gamma correction" and "gamma_out", making each switchable between pure power and sRGB gamma. That way we can observe how both the power and sRGB transfer functions impact color fidelity in the low range.
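(For illustration, a minimal sketch of what such a switchable encode could look like in a slang pass; the parameter name, the function name and the fixed 2.4 exponent in the sRGB branch are purely illustrative and not taken from the existing shader:)

#pragma parameter SRGB_OUT "sRGB encode instead of pure power" 0.0 0.0 1.0 1.0

vec3 encode_out(vec3 lin, float gamma_out, float use_srgb)
{
    vec3 pwr  = pow(lin, vec3(1.0 / gamma_out));               // pure power encode
    vec3 lo   = 12.92 * lin;                                   // sRGB linear segment
    vec3 hi   = 1.055 * pow(lin, vec3(1.0 / 2.4)) - 0.055;     // sRGB power segment
    vec3 srgb = mix(hi, lo, step(lin, vec3(0.0031308)));       // pick segment per channel
    return mix(pwr, srgb, use_srgb);                           // 0.0 = power, 1.0 = sRGB
}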

I guess gamma_input (the decoding to linear space) should always use pure power, since that is what it was originally “encoded” at?

1 Like

There are reasons for such procedures indeed, rooted partly in 32-bit-per-color buffers and the interpolation of very dark colors. But with libretro, float framebuffers can be used, which opens a nice way for different gamma combinations. Sure, there are some philosophical issues with representable numbers in a 16-bit floating-point environment. For example, value A is quite small and representable, value B is large and representable, but it happens that A + B = B (A + B is not representable) in a limited floating-point environment, because of the numerical nature of the implementation.

But the story itself is quite old even for Libretro shaders.

If I were to use sRGB buffers after linearization, then I should use the transformation; otherwise float framebuffers do quite fine.

1 Like

Thanks for the answer, only I seem to be missing your point.

You seem to be saying that it doesn't matter whether you use the power function for encoding gamma at the end of your shader or the sRGB function (which is what the user's LCD monitor decodes with)?

So are you saying that it doesn't matter whether you do your current conversion/encoding at the end of the shader with the power function:

u = Color / 255
Color = (u ^ gamma_out) * 255

or you do the sRGB gamma conversion at the end of the shader:

u = Color / 255

if u < 0.0031308:
    Color = (12.92 * u) * 255
else:
    Color = (1.055 * u^(1/2.4) - 0.055) * 255

I think grade.slang has a similar switch, but it would be more accurate if this switch were implemented at the relevant stages in your shader, as mentioned in my previous post.

So are we not understanding each other? In my mind this is simple color manipulation to accurately match the different gammas: raw emulator values are power-gamma encoded (since they were created in the CRT era), but what you send to the user's LCD display is decoded with the sRGB gamma function, so what goes out to the user's monitor should be encoded as such.

Is Dogway with his grade shader wrong on this also then?

Could you, in light of the above, give some more explanation of why you think these conversions are not relevant in the context of your shader?

1 Like

I believe RetroArch adds a layer in between, which converts to the RGBA8888 format. The monitor is set-and-forget at best.

I'm saying that linearization / delinearization is limited to 2.4/2.4 with the sRGB conversion functions, but I like more variety.

If you use the same procedure for different gamma values it creates a discontinuity.
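(To make the discontinuity concrete, a rough numerical illustration of my own, not something from the shader: keep sRGB's linear-segment constants but swap in a different exponent, and the two branches no longer meet at the breakpoint.)

float piecewise_encode(float u, float gamma)
{
    // sRGB constants with an arbitrary exponent substituted
    return (u < 0.0031308) ? 12.92 * u
                           : 1.055 * pow(u, 1.0 / gamma) - 0.055;
}
// At u = 0.0031308: gamma = 2.4 gives ~0.0404 on both branches (continuous),
// while gamma = 2.2 gives ~0.022 on the power branch vs. 0.0404 on the linear one.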

And we are also talking about emulating systems with 5 bits or less per color component, so float framebuffers do just fine. I didn't say they were perfect.

I think i answered this a couple of times already. :wink:

Edit: on second thought, the 16-bit sfloat format should hold water in most cases. :grin:

3 Likes

I also did a quick test with 8 grayscale colors from 1./255. to 8./255. and multiplied the delinearized result by 32 for better visibility.

crt-guest-dr-venom with 2.4/2.4 gamma settings.

All pixels were still visible at 3.0/3.0, the first was crushed at 3.05/3.05.

It’s interesting to know though. :wink:

3 Likes

I understand your belief / assumption. Unfortunately a belief without backup is just that. The "problem" is we keep assuming. When I asked about it, @hunterk answered that sRGB space is NOT set on RA's side (https://forums.libretro.com/t/dogways-grading-shader-slang/27148/375). Dogway, who is pretty well versed in color theory, has also been forced to make assumptions, because there are no clear answers. And without clear answers, we could be dead wrong on the correct gamma adaptation when emulating vintage CRT output on our current sRGB LCD monitors.

Why is the above relevant? Because a gamma mismatch affects the shape of the gray scale, and that has a major impact not only on image brightness and contrast, but also on hue and color saturation! In other words, a gamma mismatch causes the intention and vision of the gfx artists of vintage games to be lost.

So just to entertain, the following are my observations.

The Poynton paper says the native gamma of a CRT is a power gamma with gamma between 2.35 and 2.55. (Remember, the gamma of 2.2 on PVMs is NOT native CRT gamma; it is engineered for studio purposes.) Let's assume a specific value of 2.5:

I.e. Brightness = (normalized luminance value) ^ 2.50

Poynton notes that the brightness perception of the human eye is approximately the inverse of the CRT power function. He doesn't go into details, but if you search for this, the theory is that there is roughly a "dark adapted" and a "bright adapted" eye. It depends on whether you've been sitting for half an hour or longer in a dim surround room (dark adapted) or a well-lit room (bright adapted), and both can roughly be described by these functions:

Dark adapted eye : Perceived brightness = (normalized luminance value) ^ 0.4

Bright adapted eye: Perceived brightness = (normalized luminance value) ^ 0.5

As you’ll quickly see the dark adapted eye is exactly the inverse function of a CRT gamma function (1/0.4=2.5).

So suppose most games were developed in dim surround (game developers loved working at night :wink: ).

Conclusion / Assumption 1: In the 80's and 90's there was NO layer that did gamma encoding. The eye's response is approximately the inverse of the CRT gamma function, so the perception end-to-end is completely linear. It was a fortunate coincidence indeed, as Poynton notes!

So that’s the theory on CRT, now to our emulation chain:

Our eye’s response is still the same as it was in the 80’s / 90’s: Dark adapted eye : Perceived brightness = (normalized luminance value) ^ 0.4

But HEY, something did change. Our current monitors DON'T have a 2.5 CRT power gamma; they mostly have an "average gamma" of 2.2.

Conclusion / Assumption 2: For 80's and 90's vintage systems there's a mismatch in emulated color luminance for R, G and B, because vintage CRTs were at a power gamma of 2.35 to 2.55 and our current LCD monitors are at a gamma of 2.2 on average.

Conclusion / Assumption 3: emulated vintage games from the 80's and 90's in RetroArch have a color luminance mismatch on R, G, and B that at worst is 2.55/2.2 = 1.16 times too bright, or at best 2.35/2.2 = 1.07 times too bright.
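(To illustrate the correction being argued for here, a minimal sketch under this post's assumptions; the function name and constants are illustrative and not taken from any of the shaders:)

// Decode with the assumed native CRT gamma, re-encode for the gamma the LCD
// actually applies. Net effect: pow(c, crt_gamma / display_gamma), i.e. an
// extra exponent of roughly 1.07 to 1.16.
const float crt_gamma     = 2.5;   // assumed vintage CRT gamma (2.35 - 2.55)
const float display_gamma = 2.2;   // assumed LCD decoding gamma

vec3 gamma_match(vec3 c)
{
    vec3 lin = pow(c, vec3(crt_gamma));            // gamma_in: to linear light
    return pow(lin, vec3(1.0 / display_gamma));    // gamma_out: encode for the LCD
}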

Since a gamma mismatch affects the shape of the gray scale, it has a major impact not only on image brightness and contrast, but also on hue and color saturation! There's a noticeable mismatch between what the CRT game developer saw on his screen in the 80's and 90's and what we are now seeing through emulation. It gets borked!

So now tell me how we adapt for this difference in the emulator chain?

  1. RetroArch should do no gamma encoding in the gfx output anywhere. Who can confirm this is really the case?
  2. If 1: then the shader output should be calibrated using at least a gamma correction of 2.35/2.2 = 1.07. (In the current development version of guest-venom this means a "gamma correction" that's actually the inverse value, 1/1.07, so a default "gamma correction" setting of 0.935 or lower to be accurate for 80's and 90's vintage games, when displayed on an LCD monitor with 2.2 gamma.) Right?
  3. If not 1: who knows where the gray-scale curve gets bent by some gamma correction nobody can tell what it actually is? Again, we shouldn't settle for belief or assumption here; someone with authority on RA's core code should be able to confirm.

So who else finds it useful to get to the bottom of this?

If the output of CRT gamma was 2.50, then you want an absolute measured output gamma of 2.50 on your LCD after all things considered (even assuming your display is calibrated to 2.2). That means that the digital linearization and gamma encoding should cancel each other out, as I did, while matching your calibrated display gamma.

For the analogue color work I use the SMPTE-C transfer function, which is similar to the sRGB one but with a different threshold for the linear part. When you use the moncurve functions in grade, the disparity in gamma doesn't create a discontinuity.
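(For reference, a generic sketch of such a parameterized piecewise curve, not grade.slang's actual moncurve code: the breakpoint and linear slope are derived from the exponent and offset so the two segments always join smoothly, which is why changing the gamma doesn't introduce a step.)

// Piecewise encode: linear segment below the breakpoint, offset power law above.
// 'a' is the offset (0.055 for sRGB); breakpoint and slope follow from continuity.
vec3 piecewise_gamma_encode(vec3 lin, float gamma, float a)
{
    float brk   = pow(a * gamma / ((1.0 + a) * (gamma - 1.0)), gamma);                 // linear-light breakpoint
    float slope = pow((1.0 + a) / gamma, gamma) * pow((gamma - 1.0) / a, gamma - 1.0); // linear-segment slope
    vec3 hi = (1.0 + a) * pow(lin, vec3(1.0 / gamma)) - a;
    vec3 lo = slope * lin;
    return mix(hi, lo, step(lin, vec3(brk)));   // per-channel segment selection
}
// With gamma = 2.4 and a = 0.055 this reproduces the sRGB curve almost exactly
// (breakpoint ~0.003, slope ~12.92).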

I don't know about scanlines and their requisites, but if it keeps gamma unaltered/cancelled (other than scanline-related gamma) it should honor grade's gamma.

1 Like

RetroArch doesn’t really do anything with gamma, for better or for worse. It really only comes into play as far as the framebuffers are concerned, otherwise everything is ~2.2.

And what if you overscale or underscale the shader's last pass? Or the rotations with arcade games/cores? You could think about how the 'fitting image' can happen, by magic or by applying an additional buffer? Nevertheless, the monitor really doesn't care about internal gamma processing. :wink:

Emulated cores deliver the frames. If, and I mentioned this before, the non-shaded output looks fine to you color-wise, then my CRT shader shouldn't do corrections. If the input frames are off, then a simple correction should do it at the pre-shader level. Simple as that. There is really no need for every shader to include the variety of corrections that consider a large set of different CRT displays and emulated systems.

It's not relevant, because the default input color space for shaders is relative gamma 1.0, and the default output color space for my shader is also 1.0. It can do a lot of stuff, even different internal gamma processing and a sort of correction, but in this case the layers after the shader, plus the monitor, will consider the relative gamma 1.0 output.

I can improve my gamma function, but it will still be a gamma function with possible gamma combinations. I think it’s common sense to use the available shader parameters and tweak to one’s liking without expecting something bad to happen. :wink:

1 Like

I don't understand what you mean when you say that the monitor doesn't care. The only thing the monitor does is apply a gamma of 2.2, right?

I think this is where we differ: you keep saying if it looks fine to me then don't change a thing. The issue is I don't know if it looks fine, because I have no reference. Is a "gamma correction" of 1.0 correct? Am I then replicating the 2.5 gamma of a CRT monitor correctly on my LCD with the shader? I don't know; that's why the long explanation in my previous post.

I will do just that :wink:

1 Like

What should we use to test gamma after we’ve added shaders and everything? Could a homebrew rom like Test Suite with gamma test patterns be useful? Or can this be measured directly with a calibrator and the screens in Test Suite?

1 Like

@guest.r

Is there a way to disable scanlines completely for the purpose of analyzing masks?

Also, is this an existing mask pattern or one that could be implemented? I think it will work well even at 1080p; it would be 270 TVL at 1080p and 540 TVL at 4K. It would give regular subpixel spacing and eliminate chromatic aberration, which should result in better phosphor definition. Sorry for being lazy and not checking the masks myself!

[image: proposed mask subpixel pattern]

2 Likes

To get a 'middle' conversion regarding the difference between 2.5 and 2.2: with an input gamma of 2.4, an output gamma of 2.112 should be used (2.4 / 2.112 ≈ 2.5 / 2.2, so the net exponent is the same).

If you use a neutral combination, i.e. 2.4/2.4 or 2.2/2.2, then it really depends on how the original input content is encoded. 2.4 divided by 2.4 gives 1.0, and what happens afterwards assumes the "relative" 1.0, since we are not using sRGB as the last pass.

You can lower the interlace trigger resolution, and then it's best to select interlace mode 3.

It can be done, np.

2 Likes

It’s mask 6 in my subpixel mask function (dunno what happened to the comment there :P): https://github.com/libretro/slang-shaders/blob/master/include/subpixel_masks.h#L94

2 Likes

I've been testing a bit and something in the range of 2.4/2.27 seems rather nice for my monitor. It roughly translates to an emulated CRT gamma of 2.33 for my setup then. (Still close to the theoretical range of vintage CRTs, 2.35 - 2.55, so I can live with that :wink: ) Going much lower darkens the image slightly too much for my taste. I think for newer systems like the PS1 I'm more content with a slightly higher value for gamma out, going more toward neutral. Well, back to playing games now :hugs:

In the arcade game Outrun, in the clouds, if I adjust any of the default color settings it blows out all the details. (Setting a color temp of -100 for example). I’m not sure if that is expected behavior.

1 Like

Can you confirm this is a regression compared with the previous version? Color temperature is also on my to-do list, but I will 'prio' it a bit now.

You can use a calibrator, yes, but if you lack one and assume it's calibrated to 2.2 or 2.4, you can test the output of a grayscale ramp from RetroArch in DaVinci Resolve (free) with the Parade Scope, or even AviSynth (lighter). Compare it with a gamma ramp of your liking. It should measure grade's CRT gamma, given that crt-guest-venom or following shaders don't alter it.
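(If it helps, a trivial sketch of a last-pass fragment that replaces the image with a stepped grayscale ramp for this kind of measurement; this is purely illustrative and not part of grade or the guest shaders, and it assumes the usual vTexCoord/FragColor names of a slang fragment stage:)

// Outputs a horizontal 0-1 gray ramp quantized to 32 steps, so the parade scope
// shows discrete levels whose measured values can be compared against a target gamma.
void main()
{
    float steps = 32.0;
    float ramp  = floor(vTexCoord.x * steps) / (steps - 1.0);
    FragColor   = vec4(vec3(ramp), 1.0);
}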

1 Like

Looks good here…? Color temp 50%.

I’ve noticed that beam shape and scanline parameters have a big effect on this.

shaders = "7"
shader0 = "shaders_slang/misc/grade.slang"
filter_linear0 = "false"
wrap_mode0 = "clamp_to_border"
mipmap_input0 = "false"
alias0 = "StockPass"
float_framebuffer0 = "false"
srgb_framebuffer0 = "false"
scale_type_x0 = "source"
scale_x0 = "1.000000"
scale_type_y0 = "source"
scale_y0 = "1.000000"
shader1 = "shaders_slang/crt/shaders/guest/crt-gdv-new/afterglow0.slang"
filter_linear1 = "false"
wrap_mode1 = "clamp_to_border"
mipmap_input1 = "false"
alias1 = "AfterglowPass"
float_framebuffer1 = "false"
srgb_framebuffer1 = "false"
scale_type_x1 = "source"
scale_x1 = "1.000000"
scale_type_y1 = "source"
scale_y1 = "1.000000"
shader2 = "shaders_slang/crt/shaders/guest/crt-gdv-new/pre-shaders-afterglow.slang"
filter_linear2 = "false"
wrap_mode2 = "clamp_to_border"
mipmap_input2 = "false"
alias2 = "LinearizePass"
float_framebuffer2 = "true"
srgb_framebuffer2 = "false"
scale_type_x2 = "source"
scale_x2 = "1.000000"
scale_type_y2 = "source"
scale_y2 = "1.000000"
shader3 = "shaders_slang/crt/shaders/guest/crt-gdv-new/avg-lum.slang"
filter_linear3 = "true"
wrap_mode3 = "clamp_to_border"
mipmap_input3 = "true"
alias3 = "AvgLumPass"
float_framebuffer3 = "false"
srgb_framebuffer3 = "false"
scale_type_x3 = "source"
scale_x3 = "1.000000"
scale_type_y3 = "source"
scale_y3 = "1.000000"
shader4 = "shaders_slang/crt/shaders/guest/crt-gdv-new/blur_horiz2.slang"
filter_linear4 = "true"
wrap_mode4 = "clamp_to_border"
mipmap_input4 = "false"
alias4 = ""
float_framebuffer4 = "true"
srgb_framebuffer4 = "false"
scale_type_x4 = "absolute"
scale_x4 = "800"
scale_type_y4 = "source"
scale_y4 = "1.000000"
shader5 = "shaders_slang/crt/shaders/guest/crt-gdv-new/blur_vert2.slang"
filter_linear5 = "true"
wrap_mode5 = "clamp_to_border"
mipmap_input5 = "false"
alias5 = "GlowPass"
float_framebuffer5 = "true"
srgb_framebuffer5 = "false"
scale_type_x5 = "absolute"
scale_x5 = "800"
scale_type_y5 = "absolute"
scale_y5 = "600"
shader6 = "shaders_slang/crt/shaders/guest/crt-gdv-new/crt-guest-dr-venom2.slang"
filter_linear6 = "true"
wrap_mode6 = "clamp_to_border"
mipmap_input6 = "false"
alias6 = ""
float_framebuffer6 = "false"
srgb_framebuffer6 = "false"
scale_type_x6 = "viewport"
scale_x6 = "1.000000"
scale_type_y6 = "viewport"
scale_y6 = "1.000000"
parameters = "g_gamma_in;g_signal_type;g_gamma_type;g_crtgamut;g_space_out;g_hue_degrees;g_I_SHIFT;g_Q_SHIFT;g_I_MUL;g_Q_MUL;g_lum_fix;g_vignette;g_vstr;g_vpower;g_lum;g_cntrst;g_mid;wp_temperature;g_sat;g_vibr;g_satr;g_satg;g_satb;g_lift;blr;blg;blb;wlr;wlg;wlb;rg;rb;gr;gb;br;bg;LUT_Size1;LUT1_toggle;LUT_Size2;LUT2_toggle;PR;PG;PB;AS;sat;TNTC;CP;CS;WP;wp_saturation;GAMMA_INPUT;lsmooth;SIZEH;GLOW_FALLOFF_H;SIZEV;GLOW_FALLOFF_V;glow;bloom;TATE;IOS;OS;BLOOM;gamma_c;brightboost;brightboost1;gsl;scanline1;scanline2;beam_min;beam_max;beam_size;vertmask;scans;spike;h_sharp;s_sharp;csize;bsize;warpX;warpY;shadowMask;masksize;maskDark;maskLight;CGWG;mcut;slotmask;slotwidth;double_slot;slotms;mclip;inter;interm;gamma_out"
g_gamma_in = "2.400000"
g_signal_type = "1.000000"
g_gamma_type = "1.000000"
g_crtgamut = "2.000000"
g_space_out = "0.000000"
g_hue_degrees = "0.000000"
g_I_SHIFT = "0.000000"
g_Q_SHIFT = "0.000000"
g_I_MUL = "1.000000"
g_Q_MUL = "1.000000"
g_lum_fix = "0.000000"
g_vignette = "0.000000"
g_vstr = "40.000000"
g_vpower = "0.200000"
g_lum = "0.000000"
g_cntrst = "0.000000"
g_mid = "0.500000"
wp_temperature = "8005.000000"
g_sat = "0.000000"
g_vibr = "0.000000"
g_satr = "0.000000"
g_satg = "0.000000"
g_satb = "0.000000"
g_lift = "0.000000"
blr = "0.000000"
blg = "0.000000"
blb = "0.000000"
wlr = "1.000000"
wlg = "1.000000"
wlb = "1.000000"
rg = "0.000000"
rb = "0.000000"
gr = "0.000000"
gb = "0.000000"
br = "0.000000"
bg = "0.000000"
LUT_Size1 = "16.000000"
LUT1_toggle = "0.000000"
LUT_Size2 = "64.000000"
LUT2_toggle = "0.000000"
PR = "0.140000"
PG = "0.140000"
PB = "0.140000"
AS = "0.080000"
sat = "0.100000"
TNTC = "0.000000"
CP = "0.000000"
CS = "0.000000"
WP = "0.000000"
wp_saturation = "1.000000"
GAMMA_INPUT = "2.400000"
lsmooth = "0.900000"
SIZEH = "4.000000"
GLOW_FALLOFF_H = "0.300000"
SIZEV = "4.000000"
GLOW_FALLOFF_V = "0.300000"
glow = "0.000000"
bloom = "0.000000"
TATE = "0.000000"
IOS = "0.000000"
OS = "1.000000"
BLOOM = "0.000000"
gamma_c = "1.000000"
brightboost = "1.000000"
brightboost1 = "1.500000"
gsl = "0.000000"
scanline1 = "10.000000"
scanline2 = "20.000000"
beam_min = "1.999999"
beam_max = "1.000000"
beam_size = "1.000000"
vertmask = "0.000000"
scans = "0.600000"
spike = "1.000000"
h_sharp = "3.000001"
s_sharp = "1.000000"
csize = "0.000000"
bsize = "600.000000"
warpX = "0.000000"
warpY = "0.000000"
shadowMask = "8.000000"
masksize = "1.000000"
maskDark = "0.500000"
maskLight = "1.000000"
CGWG = "0.500000"
mcut = "1.150000"
slotmask = "0.000000"
slotwidth = "2.000000"
double_slot = "1.000000"
slotms = "1.000000"
mclip = "0.500000"
inter = "350.000000"
interm = "1.000000"
gamma_out = "2.400000"
textures = "SamplerLUT1;SamplerLUT2;SamplerLUT3"
SamplerLUT1 = "shaders_slang/crt/shaders/guest/lut/sony_trinitron1.png"
SamplerLUT1_linear = "true"
SamplerLUT1_wrap_mode = "clamp_to_border"
SamplerLUT1_mipmap = "false"
SamplerLUT2 = "shaders_slang/crt/shaders/guest/lut/sony_trinitron2.png"
SamplerLUT2_linear = "true"
SamplerLUT2_wrap_mode = "clamp_to_border"
SamplerLUT2_mipmap = "false"
SamplerLUT3 = "shaders_slang/crt/shaders/guest/lut/other1.png"
SamplerLUT3_linear = "true"
SamplerLUT3_wrap_mode = "clamp_to_border"
SamplerLUT3_mipmap = "false"

After some playing around, the problem is the grade pass's white point. If you set the white point to something like 8000, the color settings of venom wash out details in the brights. It probably doesn't make sense to adjust them if you are using grade anyway.

1 Like