lolol shows what I know.
Hey @Nesguy, do we have this preset in the MegaBezel Shader Pack? I liked this shot. Could you please post the preset code?
Hi! All my presets can be found here. There may have been a couple small tweaks since that shot was taken, for the better hopefully.
I’m wondering if there’s any possibility of a shader geared exclusively to 100%-strength masks. GDV works very well with the two-pixel masks at 100% strength on an SDR monitor; it’s when you try to do the same with a three-pixel mask that you have to resort to extreme measures, and it feels like an uphill battle. Have you considered this? Then again, I may just be running into the limitations of SDR… with a 3-pixel mask (at 100%) you need 3x the base brightness to achieve the same luminosity per pixel.
I have indeed considered some faux tricks, but they don’t look too consistent. The main problem with masks is that consistency is very important for an appealing look. Currently I doubt there are any solutions that let an RGB or RGBX mask at full strength look brighter than the display allows. My display can produce a medium level of brightness (400+ nits) in SDR mode and I still like mask mitigation techniques best; maybe only a staggered 2-size mask looks good enough to me at full strength.
Something about 100% masks that bothers me is how they look on bright pixels.
See this comparison:
Left: shader with 100% mask
Right: screenshot taken from a Trinitron
The 100% mask looks perfect on dark pixels. Compare the backgrounds.
But on bright pixels they are completely different. There must be some way to fix this. I haven’t found any shaders that overcome this.
I think that’s a side effect of the camera’s dynamic range rather than an actual behavior. That is, I believe it’s just bloom/flare-out in the photo.
Well, maybe the CRT brightness is leaking over the mask, if that’s possible.
You could probably mimic the effect by modifying the mask strength by the inverse of the luminance, but this monkeys with the black level in a way that I don’t love:
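The idea above can be sketched in a few lines. This is only a hypothetical illustration (the function name and the Rec. 709 luma weights are my choices, not anything from an actual shader): scale the mask strength by the inverse of the pixel’s luminance, so the mask fades out on bright pixels the way the photo suggests, while dark pixels keep the full mask.

```python
def apply_mask(color, mask_weight, base_strength=1.0):
    """Apply a phosphor-mask weight to one pixel, fading the mask on
    bright pixels (hypothetical sketch of the idea above).

    color: per-channel values in [0, 1]
    mask_weight: per-channel mask weights in [0, 1] (0 = fully masked)
    """
    # Rec. 709 luma as a simple luminance estimate
    luma = 0.2126 * color[0] + 0.7152 * color[1] + 0.0722 * color[2]
    # Scale mask strength by the inverse of luminance: full-strength
    # mask on dark pixels, mask faded out entirely on white.
    strength = base_strength * (1.0 - luma)
    return tuple(c * (1.0 - strength * (1.0 - w))
                 for c, w in zip(color, mask_weight))
```

On a pure white pixel the strength drops to zero and the mask vanishes; on a dark pixel the masked channels are attenuated almost fully, which is where the black-level side effect mentioned above comes from as soon as you push the curve around.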
OTOH, this also happens with luma-linked beam width, but you and guest have handled that admirably, so maybe this isn’t that big of a deal 
We need to make sure we’re comparing apples to apples, I think.
For the comparison to be useful, we need a photo of the aperture grille CRT with the individual phosphors in-focus (ie., able to see the RGB strips). Same thing with the shader, it should be a photo of the actual LCD screen (preferably a 4K one), with the emulated phosphors in-focus. Check out the Megatron shader thread for some excellent side-by-side comparisons.
CRT displays imo had different chromatic/lighting properties. A good example is that you need at least 2x the nits on a flat display combined with a full-mask-strength shader to properly ‘emulate’ a 1x-nits CRT TV display. The images above might have been taken with about the same lighting strength.
Are you saying you need (for example) 200 nits + full mask strength to equal a 100-nit CRT? That sounds right for the 2px masks, which result in a 50% reduction in brightness. For the 3px masks I think you need 3x the nits, and for the slot mask it’s even worse.
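The arithmetic behind those numbers can be checked directly. A minimal sketch (the patterns are just the usual magenta/green and RGB stripe layouts written by hand, and the helper name is mine): average each channel’s transmission over one repeat of a full-strength mask; the reciprocal of that is the brightness multiplier the display needs.

```python
def transmission(pattern):
    """Average per-channel transmission over one repeat of a
    full-strength mask pattern (list of per-column RGB weights)."""
    n = len(pattern)
    return tuple(sum(col[ch] for col in pattern) / n for ch in range(3))

# 2px magenta/green mask: each channel is lit in half the columns.
mg = [(1, 0, 1), (0, 1, 0)]
# 3px RGB stripe (aperture grille): each channel lit in one column of three.
rgb = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

print(transmission(mg))   # (0.5, 0.5, 0.5) -> needs 2x brightness
print(transmission(rgb))  # 1/3 per channel -> needs 3x brightness
```

A slot mask additionally blanks rows within the pattern, so its transmission drops below 1/3 and the required multiplier climbs further.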
I started experimenting with a -50 mask value in crt-guest-advanced and I’m happy with the results. Try it and see if it’s something that interests you. With HDR it looks super bright at 200 nits paper white (800 nits peak white).
I wrote/meant that you need at least a ‘2x nits’ modern display, but you are quite right about the increasing necessity for brightness.
I’m not sure about these nit comparisons, as a CRT is a pulse display and an LCD is sample-and-hold. One has a very intense pulse of nits followed by a steep fall-off, while the other constantly emits a relatively steady stream of photons. They’re very different graphs.
Sure, you can average out the pulse display, but I’m not sure our brains interpret it like that. I think that intense pulse is perceived as many more nits in total than the same light spread out over a longer period with a lower peak.
That’s in the temporal dimension alone; then take into account the spatial differences.
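For the temporal side, the time-averaged comparison is simple arithmetic (the numbers below are only illustrative, not measurements): a display that pulses to a very high peak for a small fraction of the frame can average out to the same nits as a sample-and-hold panel, even though the instantaneous graphs look nothing alike.

```python
def average_nits(peak_nits, lit_fraction):
    """Time-averaged luminance of an idealized display that emits
    peak_nits for lit_fraction of each frame and nothing otherwise
    (ignores phosphor decay and any perceptual effects)."""
    return peak_nits * lit_fraction

# CRT-like pulse: a 10,000-nit flash for 1% of the frame...
crt_avg = average_nits(10_000, 0.01)
# ...averages the same as an LCD holding 100 nits the whole frame.
lcd_avg = average_nits(100, 1.0)
print(crt_avg, lcd_avg)  # both 100 nits on average
```

Which is exactly the point above: a colorimeter integrating over time sees the two as equal, while the eye may not.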
From my limited testing with an HDR600 monitor, it gets reasonably close in brightness to a 2730 PVM. Certainly not quite as bright, but I think a modern QD-OLED or LCD will surpass it with a 100% strength 4px mask. Whether a colorimeter agrees with that over the whole screen is a different matter.
What it won’t do is mimic the pulse, though - which is really the next step, and one that’s starting to be taken with backlight strobing, as I’ve said (and bored you with) elsewhere.
Something is fundamentally different from the way the light comes off a real shadow mask and how it comes off of these mask shaders. I have always ended up turning masks off because it looks too artificial. The effect isn’t convincing.
Maybe HDR is the solution, but even at high brightness something looks off about the mask effects. It looks ‘pixelly’ I guess.
I once tried preblending the mask colors but it didn’t look good. Instead, I wonder whether taking existing masks and substituting green with white would have a better effect. I can’t remember if there are subpixel masks that do that.
A fundamental problem for masks and scanlines right now is panel response times, combined with sustained brightness levels. Sometimes one must invest in a proper display: the best bet is OLED, the second best a 180Hz or 240Hz display for sub-frame effects. I’m still a bit cautious about OLEDs displaying masked content for long stretches of time; it needs some shifting mechanism over time to avoid burn-in. But otherwise OLEDs can offer the best CRT emulation experience.
Add a notch of mask mitigation and a proper viewing distance, then the experience should be very nice.
You can believe me that the normalized sum of mask weights over the entire mask effect width (i.e. 3 for RGB) should be (1.0, 1.0, 1.0), or very close to it, or there will be notable tinting.
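That balance condition is easy to check numerically. A small sketch (hand-written patterns, hypothetical helper name): compare each channel’s mean weight over the mask width; equal means mean no tint. The plain RGB stripe passes, while, for example, the green-to-white substitution floated above boosts red and blue but not green, so it would tint toward magenta.

```python
def channel_means(pattern):
    """Mean mask weight per channel over one repeat of the pattern.
    Equal means -> neutral; unequal means -> visible color tint."""
    n = len(pattern)
    return tuple(sum(col[ch] for col in pattern) / n for ch in range(3))

rgb = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]             # balanced: 1/3 each
green_to_white = [(1, 0, 0), (1, 1, 1), (0, 0, 1)]  # green column -> white

print(channel_means(rgb))             # all equal -> neutral
print(channel_means(green_to_white))  # R and B doubled vs G -> magenta tint
```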
This is a pretty old thread.
Yes, you need around 250 nits on full white with shaders applied for it to feel like a CRT, in my experience.
For a while, 100-200 nits with shaders was deemed “adequate,” but my recent experiments with HDR indicate that we really need more than this. 100 nits on an LCD isn’t the same as 100 nits on a CRT; the CRT phosphor gets up to something like 10,000 nits for a few nanoseconds.
I think the solution to this is just more brightness. The phosphors are still red, green, and blue in the CRT screenshot. If the camera settings were not overexposing things, we might see the phosphors appearing as red, green, and blue.
Similarly, we could get the same overexposed effect with the LCD/shader combination by overexposing the shot via the camera settings.
This is something easy to experiment with and try out for yourself.

