New CRT shader from Guest + CRT Guest Advanced updates

All good now and no problem, it happens lol

Thank you! But are we not better off just using the “Bloom Strength” parameter instead of “Mask Bloom” when using slotmasks? What’s the difference between the two bloom settings?

I’m sorry if this was already answered previously. :stuck_out_tongue:

I tried both RGB and BGR layouts, but the result was very similar. I also noticed weird tints on the greyscale when I was using CRT Royale. Maybe it’s got to do with the subpixel structure of my TV.

But RGB masks, like mask 12, look really good and neutral on the greyscale, so I’m happy!

Don’t take my settings as gospel. I’m still learning how this shader works myself. :stuck_out_tongue:

I hope more slotmask lovers with 4K TVs show us their presets.

2 Likes

It’s good enough for now at least lol

1 Like

I’ve got a few 4K Optimized Slot Mask presets in my shader preset pack.

1 Like

@Cyber Thanks. I’ll take a look!

I want to see what mask and slotmask related parameters you use.

1 Like

Maybe not in a Mask Bloom demonstration… :grin: Ordinary bloom tends to wash out NTSC artifacting a bit and produces a different mask reduction distribution. I never felt too comfortable using ordinary bloom with some Mario games, for example.

Mask bloom also doesn’t clip colors when cranked up, better preserves horizontal transitions because colors aren’t pushed up, works nicely with mask gamma, and has a different mask presence in darker scanline parts.

But ordinary bloom is, as you mentioned, very nice with slotmask setups and normal mask setups when using lower bloom strength.

Both blooms can also be combined though.

On an RGB panel, using the shader’s BGR layout should give more consistent subpixel spacing, removing green or magenta vertical ‘lines’. But it might really be the TV’s subpixel structure.

2 Likes

I tried all the preset files for your shader, but nothing looks like the one shown in your picture. How do you set it up like the picture?

2 Likes

Try:

  • Scanline type 2.0
  • CRT mask type: 5.0 or 6.0
  • Mask Strength: 0.2
  • Mask Strength low: 1.25

Or copy these parameters into a crt-guest-advanced saved preset:

gsl = "2.000000"
shadowMask = "5.000000"
maskstr = "0.200000"
mcut = "1.250000"

Ofc. you can tweak it a bit.

3 Likes

It still feels a little different from the above picture, especially the scanlines are a bit obvious…

2 Likes

You can tweak the shader a bit more to your liking. It has plenty of parameters to correct the appearance. Although it’s sometimes quite hard to compare the settings on different images.

Screenie:

Settings:

gsl = "2.000000"
scans = "1.100000"
h_sharp = "4.500000"
s_sharp = "1.000000"
shadowMask = "6.000000"
maskstr = "0.100000"
mcut = "1.250000"

3 Likes

That’s a relatively old screenshot from a relatively old preset. The characteristics of some of the parameters might have changed/evolved over time since it was made.

In order to get that look, you might need to enable integer scaling as well as turn down most if not all parameters which add any blur, glow, halation, horizontal or vertical filtering.

Maybe @Nesguy can assist since I think he might have originally made this preset. Why not head over to one of his threads and download one of his newer ones?

3 Likes

It’s an adapted version of crt-guest-dr-venom, where I was testing the new mask controls.

Mask distribution over edges and the saturation caused by the scanline code have improved in recent versions, so it might also be a good idea to try crt-guest-dr-venom or crt-guest-sm for the ‘exact look’, but the advanced version can get pretty close, probably with better handling of some situations.

Glow also needs to be tuned down, etc.

1 Like

Thanks for the explanation.

Yeah, I think I’ll be using the ordinary bloom for now, as the mask bloom seems to need a bit more customization and I’m happy with the look I got.

Here it is:

I’ve used these parameters with the NTSC preset:

quality = "0.000000"
ntsc_sharp = "-8.000000"
interm = "3.000000"
bloom = "1.000000"
gamma_c = "1.200000"
brightboost = "1.000000"
brightboost1 = "1.000000"
shadowMask = "12.000000"
maskstr = "1.000000"
slotmask = "1.000000"
slotmask1 = "1.000000"
slotwidth = "7.000000"
double_slot = "3.000000"
mclip = "0.000000"
gamma_out = "2.200000"

“quality” is set for S-Video because I don’t want Composite artifacts and S-Video still blends the Sonic waterfall and other colors on the Mega Drive/Genesis.

I’ve set the interlacing mode to 3 as it’s the mode I found works for interlacing on all my cores.

I then set bloom to 1.0 and mask clip to 0.0. I was having trouble adjusting the gamma correction, but after I set “gamma_out” to 2.2 instead of the default, I could much more easily use the gamma correction parameter to fine-tune the gamma. Shouldn’t “gamma_out” be set to 2.2 by default, as that’s the closest to the sRGB standard?

I also set both brightboost values to 1.0 (I assume neutral?), as I saw you do the same on your slotmask preset previously, and I used the gamma correction instead to fix the gamma.

I am also very curious about how the NTSC adaptive sharpening works. What kind of magic makes it possible to sharpen the picture while keeping dedithering intact? I also found no visible sharpening artifacts. It’s honestly amazing.

2 Likes

If gamma_out is lower than gamma_in, then the image gets an additional portion of saturation applied. Some shader mechanics already work towards increased saturation, so it would produce an additional excess.
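A minimal Python sketch of this point (illustrative only, not the shader’s code; the test color and the `x ** (gamma_in / gamma_out)` model of an in/out gamma pair are assumptions): a quotient above 1.0 darkens mid-tones per channel and raises HSV saturation, while a neutral quotient of 1.0 leaves the color untouched.

```python
# Illustration only (not crt-guest-advanced source code): per-channel
# gamma with quotient != 1.0 changes saturation.

def apply_gamma(rgb, gamma_in, gamma_out):
    # decode with gamma_in, re-encode with gamma_out:
    # net effect per channel is x ** (gamma_in / gamma_out)
    return tuple(c ** (gamma_in / gamma_out) for c in rgb)

def hsv_saturation(rgb):
    mx, mn = max(rgb), min(rgb)
    return 0.0 if mx == 0 else (mx - mn) / mx

color   = (0.8, 0.4, 0.2)               # arbitrary orange test color
neutral = apply_gamma(color, 2.4, 2.4)  # quotient 1.0: unchanged
boosted = apply_gamma(color, 2.4, 2.2)  # quotient ~1.09: darker mids

print(hsv_saturation(color))    # 0.75
print(hsv_saturation(boosted))  # higher than 0.75
```

The darker channels drop faster than the brighter ones under an exponent above 1.0, which is where the extra saturation comes from.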

As I already stated, shader gamma has nothing to do directly with ‘LCD gamma’, and I like to keep it neutral. It’s mostly about internal processing like horizontal filtering and vertical scanline blending, glow and bloom distribution etc. The image which is sent to the display shouldn’t have any full-scale gamma treatment compared with the original image. The original image is the reference and is displayed correctly without shaders, or with shaders that don’t have gamma functionality. It’s not that hard to understand, I guess.

Sometimes I like to increase it even more, including with bloom setups. It’s a matter of balancing the image and working towards a result in the end.

Gamma correction is a nice feature and it doesn’t change saturation. I absolutely prefer it like this. It’s to be used to correct brightness with stronger masks and scanlines.

Thanks for the acknowledgement. :smiley: It requires a couple of layers of logic to produce the results, and lots of testing, but I’m also very happy with the results. A screenshot of a CRT display in the other thread showed some sharpening shadows which can be achieved with the main filtering or sharpen pass. It’s some kind of proof that CRT displays dealt with the fuzziness issue.

3 Likes

I think the technique is called Raw Awesome Guest.r Wizardry :star_struck:

6 Likes

Thanks for the thorough explanation! :slight_smile: Even though a lot of it is over my head, haha!

But isn’t the default gamma_out lower than default gamma_in already?

What would be the “neutral” value to you?

But wasn’t CRT gamma around 2.4~2.5, in contrast to the 2.2 gamma of current displays? Shouldn’t we always convert it to 2.2 in the case of sRGB displays?

Got it! I’ll keep to it for gamma adjustments.

So the NTSC adaptive sharpness doesn’t use any pattern detection like mdapt or gdapt do? I never liked mdapt or gdapt much because I don’t want false positives.

Sorry for all the questions! I’m trying to understand a bit more how your shader works. :slight_smile:

1 Like

No, it’s 2.4/2.4 with this shader.

You must consider both values. If the quotient is 1.0 (2.4 / 2.4 = 1.0), then it’s neutral.

We don’t convert in general; we just do interpolation in linear space (2.4). Like I mentioned, RA works appropriately without shaders which would do gamma.

The NTSC version uses a gamma combo of 2.0/1.95 and the HD version a combo of 1.80/1.75. (Now questions will start, lol).

If we used something like 2.4/2.2, then a gamma surplus would be created, assuming an LCD gamma of 2.4. This is a bit harder to understand; LCD displays do their own thing.

For example, you can go with 2.4 / 2.4, 2.2 / 2.2 or 2.0 / 2.0 combos. The end result would look very similar on an LCD. The quotient is 1.0 in all cases.

But if you use 2.4 / 2.2, 2.2 / 2.2 or 2.0 / 2.2, the results would look different, although the output gamma is always 2.2. The quotient in the first case is ~1.1, 1.0 in the second case, and ~0.9 in the third.

~1.1 assumes/compensates for an LCD gamma of 2.4, 1.0 for an LCD gamma of 2.2, and 0.9 for an LCD gamma of ~2.0.

The only reason one would use a 2.4/2.2 combo is to increase contrast and saturation. It’s sometimes beneficial if the shader is coded in a specific manner.
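The quotient arithmetic above can be checked in a few lines of Python (a sketch, assuming the `x ** (gamma_in / gamma_out)` model; `net_exponent` is a name I made up):

```python
# Only the quotient gamma_in / gamma_out matters, not the absolute values:
# (x ** gamma_in) ** (1 / gamma_out) == x ** (gamma_in / gamma_out)

def net_exponent(gamma_in, gamma_out):
    return gamma_in / gamma_out

# identity combos: quotient 1.0, mid-gray passes through unchanged
for g in (2.4, 2.2, 2.0):
    assert abs(0.5 ** net_exponent(g, g) - 0.5) < 1e-9

# same output gamma (2.2), different quotients -> different images
print(round(net_exponent(2.4, 2.2), 2))  # 1.09 -> darker mids, more contrast
print(round(net_exponent(2.2, 2.2), 2))  # 1.0  -> neutral
print(round(net_exponent(2.0, 2.2), 2))  # 0.91 -> lighter mids

# a 2.4/2.2 combo visibly darkens a mid-gray (~0.47 instead of 0.5)
print(0.5 ** net_exponent(2.4, 2.2))
```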

It uses a kind of logic, but not pattern detection. The application of the logic is ‘continuous’ rather than discrete; that’s why it doesn’t show detection artifacts.
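The thread doesn’t show the actual implementation, but the continuous-vs-discrete idea can be illustrated generically (everything below is my own toy example, not guest.r’s code): instead of a hard “pattern detected / not detected” branch, the sharpening term is scaled by a smooth weight, so there is no boundary where detection artifacts could appear.

```python
# Toy 1-D unsharp mask with a continuous, contrast-based weight
# (illustrative only; not the crt-guest-advanced NTSC sharpen pass).

def sharpen_1d(signal, strength=1.0):
    out = list(signal)
    for i in range(1, len(signal) - 1):
        # high-frequency component relative to the neighborhood average
        detail = signal[i] - (signal[i - 1] + signal[i + 1]) / 2.0
        # smooth weight in 0..1: no threshold, no discrete decision
        weight = abs(detail) / (abs(detail) + 0.1)
        out[i] = signal[i] + strength * weight * detail
    return out

print(sharpen_1d([0.5, 0.5, 0.5, 0.5, 0.5]))  # flat input: unchanged
print(sharpen_1d([0.0, 0.0, 1.0, 1.0, 1.0]))  # edge: overshoot at the step
```

Because the weight varies continuously with local contrast, flat areas pass through untouched and edges are emphasized, with no abrupt on/off transitions.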

2 Likes

Ah! This makes sense now! I’ve always used the NTSC preset, so I did not know the base shader uses 2.4/2.4 (and I read your post before you edited it). And you’re right, I don’t understand why the NTSC preset is 2.0/1.95, but I’m sure there’s a good reason for it! :smiley:

So, in the end, the recommended way to adjust the gamma is on the “Gamma correct” parameter?

2 Likes

@guest.r IIRC HDR uses BT.2020, which uses 2.4 gamma. I’ve made changes to gamma for my presets that are 2.2 content and 2.2 CRT. Is there a more elegant way to do this? It’s sometimes difficult to make sense of some of the settings. Thanks again for all of the recent improvements.

Maybe phrasing these settings as “display gamma” and “content gamma” would make more sense – to me, at least. To be honest, I don’t know what the correct settings would be. Sorry if what I said made no sense. :sweat_smile:

3 Likes