If the output of the CRT was gamma 2.50, then you want an absolute measured output of gamma 2.50 on your LCD after everything is taken into account (even assuming your display is calibrated to 2.2). That means the digital linearization and the gamma encoding should cancel each other out, as I did while matching your calibrated display gamma.
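A minimal sketch of that idea in GLSL (the exponents here are illustrative, not grade's actual defaults):

```glsl
// Decode and encode with the same exponent: the pair cancels and the
// shader stays gamma-neutral (relative gamma 1.0).
vec3 neutral(vec3 col)
{
    vec3 lin = pow(col, vec3(2.20));       // digital linearization (decode)
    return pow(lin, vec3(1.0 / 2.20));     // gamma encoding: cancels the decode
}

// For an absolute 2.50 on a 2.2-calibrated LCD the pair must NOT cancel:
// decode with 2.50, encode with 1/2.20, and the display's own 2.2
// brings the chain back to c^2.5.
vec3 crt25(vec3 col)
{
    return pow(pow(col, vec3(2.50)), vec3(1.0 / 2.20));
}
```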
For the analogue color work I use the SMPTE-C transfer function, which is similar to the sRGB one but with a different threshold for the linear part. Because grade uses the moncurve functions, the disparity in gamma doesn't create a discontinuity.
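For reference, a sketch of that kind of moncurve function (modeled on the moncurve_f style used in grade; treat the exact form as an approximation):

```glsl
// Piecewise EOTF: a power segment with an offset plus a linear toe.
// gamma = 2.4, offs = 0.055 reproduces sRGB; SMPTE-C-style curves use
// different constants but the same shape. The linear slope fs is chosen
// so both segments meet smoothly at the breakpoint xb, which is why
// changing the gamma/offset pair doesn't introduce a discontinuity.
float moncurve_f(float c, float gamma, float offs)
{
    c = clamp(c, 0.0, 1.0);
    float fs = ((gamma - 1.0) / offs) *
               pow(offs * gamma / ((gamma - 1.0) * (1.0 + offs)), gamma);
    float xb = offs / (gamma - 1.0);   // linear/power breakpoint
    return (c > xb) ? pow((c + offs) / (1.0 + offs), gamma) : c * fs;
}
```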
I don’t know about scanlines and their requirements, but as long as the pass keeps gamma unaltered/cancelled (other than scanline-related gamma) it should honor grade’s gamma.
RetroArch doesn’t really do anything with gamma, for better or for worse. It only comes into play as far as the framebuffers are concerned; otherwise everything is ~2.2.
And what if you overscale or underscale the shader’s last pass? Or rotate with arcade games/cores? You could think about how the ‘fitting image’ can happen: by magic, or by applying an additional buffer? Nevertheless, the monitor really doesn’t care about internal gamma processing.
Emulated cores deliver the frames. If, and I mentioned this before, the non-shaded output looks fine to you color-wise, then my CRT shader shouldn’t do corrections. If the input frames are off, then a simple correction at the pre-shader level should do it. Simple as that. There is really no need for every shader to include a whole variety of corrections covering a large set of different CRT displays and emulated systems.
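Such a pre-shader fix can be as small as a single pass placed in front of the chain. A hypothetical sketch (the PRE_GAMMA parameter is made up for illustration, and the usual slang boilerplate is omitted):

```glsl
// Hypothetical pre-pass: correct the core's output once, before the CRT
// shader, so the CRT shader itself can stay gamma-neutral.
#pragma parameter PRE_GAMMA "Pre-pass gamma fix" 1.0 0.5 2.0 0.01

void main()
{
    vec3 src = texture(Source, vTexCoord).rgb;
    // PRE_GAMMA = 1.0 is a no-op; only deviate if the raw frames look off.
    FragColor = vec4(pow(src, vec3(PRE_GAMMA)), 1.0);
}
```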
It’s not relevant, because the default input color space for shaders is relative gamma 1.0, and the default output color space of my shader is also 1.0. It can do a lot of stuff internally, even different gamma processing and a sort of correction, but in that case the layers after the shader, plus the monitor, will still assume a relative gamma 1.0 output.
I can improve my gamma function, but it will still be a gamma function with all the possible gamma combinations. I think it’s common sense to use the available shader parameters and tweak to one’s liking without expecting something bad to happen.
I don’t understand what you mean when you say the monitor doesn’t care. The only thing the monitor does is apply a gamma of 2.2, right?
I think this is where we differ: you keep saying that if it looks fine to me, then don’t change a thing. The issue is I don’t know if it looks fine, because I have no reference. Is a “gamma correction” of 1.0 correct? Am I then replicating the 2.5 gamma of a CRT monitor correctly on my LCD with the shader? I don’t know; hence my long explanation in the previous post.
What should we use to test gamma after we’ve added shaders and everything? Could a homebrew ROM like Test Suite with gamma test patterns be useful? Or can this be measured directly with a calibrator and the screens in Test Suite?
Is there a way to disable scanlines completely for the purpose of analyzing masks?
Also, is this an existing mask pattern, or one that could be implemented? I think it would work well even at 1080p; it would give 270 TVL at 1080p and 540 TVL at 4K. It would result in regular subpixel spacing and eliminate chromatic aberration, which should give better phosphor definition. Sorry for being lazy and not checking the masks myself!
To get a ‘middle’ conversion with regard to the difference between 2.5 and 2.2, an output gamma of 2.112 should be used with an input gamma of 2.4.
If you use a neutral gamma combination, i.e. 2.4/2.4 or 2.2/2.2, then it really depends on how the original input content is encoded. 2.4 divided by 2.4 gives 1.0, and whatever happens afterwards assumes that “relative” 1.0, since we are not using sRGB as the last pass.
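The arithmetic behind those in/out pairs, written out as a sketch (a display gamma of 2.2 is assumed):

```glsl
// Effective displayed gamma = (gamma_in / gamma_out) * display_gamma
//   2.4 / 2.112 * 2.2 = 2.5  -> the 2.2 display traces a 2.5 CRT curve
//   2.4 / 2.4   * 2.2 = 2.2  -> neutral: relative gamma 1.0 through the chain
vec3 gamma_inout(vec3 src, float g_in, float g_out)
{
    vec3 lin = pow(src, vec3(g_in));     // decode with input gamma
    return pow(lin, vec3(1.0 / g_out));  // encode with output gamma
}
```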
You can lower the interlace trigger resolution; interlace mode 3 is probably the best pick then.
I’ve been testing a bit and something in the range of 2.4/2.27 seems rather nice for my monitor. That roughly translates to an emulated CRT gamma of 2.33 for my setup (2.4 / 2.27 × 2.2 ≈ 2.33). Still close to the theoretical 2.35–2.55 range of vintage CRTs, so I can live with that. Going much lower darkens the image slightly too much for my taste. For newer systems like the PS1 I’m more content with a slightly higher gamma-out value, closer to neutral. Well, back to playing games now
In the arcade game Outrun, if I adjust any of the default color settings it blows out all the detail in the clouds (setting a color temp of -100, for example). I’m not sure if that is expected behavior.
You can use a calibrator, yes, but if you lack one and assume the display is calibrated to 2.2 or 2.4, you can test the output of a greyscale ramp from RetroArch in DaVinci Resolve (free) with the Parade Scope, or even in AviSynth (lighter). Compare it with a gamma ramp of your liking. It should measure grade’s CRT gamma, provided that crt-guest-venom or the following passes don’t alter it.
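As a rough worked example (my numbers, assuming the scope reads normalized stimulus): if the 50% step of the ramp (128/255 ≈ 0.502) reads about 19% on the parade, the displayed gamma at that step is ln(0.19) / ln(0.502) ≈ 2.4. Repeating that on a few steps shows whether the chain is tracking the CRT gamma you set in grade.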
After some playing around, the problem turns out to be the grade pass’s white point. If you set the white point to something like 8000, venom’s color settings wash out details in the brights. It probably doesn’t make sense to adjust them if you’re using grade anyway.
Are you sure this doesn’t have something to do with the scanline/beam shape parameters? Check out the shot above; I don’t see any clipped detail in-game. I have noticed that it can be hard to balance things with all the automatic corrections that occur; it takes a bit of trial and error.
@Nesguy I can’t figure out what it is. With baseline Venom2 I can make adjustments fine. When I push things to try to maximize thick-to-thin scanlines, strange things happen. Maybe it’s a combo of things I’m pushing too far. Check these settings out: tweaking the color temp on this shader blows out the Outrun clouds, and I’m also getting rainbowing. These are not good settings, but perhaps they help illuminate possible problems/conflicts. (With the shader, or with me )
Yeah, I know what you mean; it’s hard to get a grip on what’s happening under the hood, and it feels like there’s somewhat less freedom to adjust things without breaking the image in some way. I would try starting with more neutral settings and then gradually increasing them. With my settings above you should have room to play around with the beam shape low/high parameters.
I’ve compared both and haven’t seen too much of a difference, and sometimes I switch to a sub-4K monitor. I still get great results from Venom 2; I’ve just noticed a few quirks.
I’m trying out the newest version of the dr.venom2 shader and I’m getting some really bad ghosting on the NES Mesen core. It’s most visible in level 1-2 of Mario 1, which I’m posting a short clip of.
Quality isn’t the best, and I’m trying to play with only one hand on the controller.