Mega Bezel Reflection Shader! - Feedback and Updates

Ngl, I usually agree with your takes, but moire is definitely a thing that happens on actual CRTs. (It just varies in how prominent it actually is, i.e. how much you can actually notice it.)

I will say, the less prominent the scanline visibility is on a CRT, the less likely it is to present visible moire.

4 Likes

The moire on a CRT is due to completely different factors and looks different, though. It’s also usually not nearly as noticeable.

3 Likes

It can; it can also show up on film exactly like what we deal with in shaders, iirc.

I’m just saying it’s definitely a thing that happens; to say it doesn’t is wrong.

And honestly, I’m pretty sure I’ve screamed from the rooftops about what is causing the moire we tend to experience in shaders, repeatedly, over multiple different threads. (Do I have a solution? Nope.)

4 Likes

I would add that it’s not limited to curved presets either. Anywhere you have multiple fine, high-contrast grids, sets of lines or other fine patterns that are too close together to resolve, the illusion can occur. Viewing distance also plays a role in the appearance of moire; it tends to be more noticeable for me beyond a certain distance from the screen.

Some of the patterns which are competing for “space” include our scanlines, our masks (including faux grille wires), as well as the black space and general pattern that our LCD subpixel layouts and pitch create.
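Here’s a quick toy illustration of what I mean (1-D, made-up numbers, not shader code): two fine patterns at slightly different pitches multiply into a much coarser “beat” stripe, and that stripe is the moire.

```python
import numpy as np

# Two fine gratings at slightly different pitches (e.g. scanlines vs the
# mask/subpixel pitch) multiplied together produce a much coarser "beat"
# pattern - that beat is the stripe the eye picks out as moire.
x = np.linspace(0.0, 1.0, 4000, endpoint=False)        # position across the screen
scanlines = 0.5 + 0.5 * np.cos(2 * np.pi * 300 * x)    # e.g. ~300 visible lines
mask      = 0.5 + 0.5 * np.cos(2 * np.pi * 310 * x)    # nearby pitch, e.g. 310

combined = scanlines * mask

# The product contains a component at |310 - 300| = 10 cycles across the
# screen, far coarser than either source pattern.
spectrum = np.abs(np.fft.rfft(combined - combined.mean()))
print("strongest low-frequency component:", np.argmax(spectrum[:50]), "cycles")
```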

It’s probably my Achilles’ heel when it comes to trying to get certain presets looking right when scaled down to fit certain overlays. Lots of compromises in scanline gamma had to take place.

3 Likes

Basically a fancy way of saying what I’ve been saying.

High contrast patterns cause optical illusions, brain and eyes go brrrrr.

Honestly at this point I’m starting to believe CRTs relied solely on optical illusion black magic to function correctly :joy:.

3 Likes

I think this is the only solution which always works, but I agree it isn’t a good one, because it necessarily destroys the mask.

My understanding of the issue is that it’s caused by two things: a discrete maximum sampling resolution (your LCD resolution), and patterns you are trying to sample that are at a higher frequency than that resolution can represent. I think this equates to sampling a pattern whose frequency is past the Nyquist limit of your LCD resolution.

So I think the higher-frequency pattern is created when the curvature and scanlines are combined with the high-contrast mask.
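A tiny toy version of that argument (the numbers are arbitrary, not taken from any preset):

```python
import numpy as np

# A pattern above the display's Nyquist limit folds back to a false low
# frequency when it is sampled on the pixel grid - which is the moire.
display_px = 1080                        # vertical pixels available
nyquist = display_px / 2                 # highest frequency the grid can represent

pattern_freq = 620                       # fine scanline+mask detail after curvature
y = np.arange(display_px) / display_px   # one sample per display pixel
sampled = np.cos(2 * np.pi * pattern_freq * y)

peak = np.argmax(np.abs(np.fft.rfft(sampled)))
print(f"Nyquist limit: {nyquist:.0f} cycles")
print(f"{pattern_freq} cycles shows up as ~{peak} cycles")   # aliases to 1080 - 620 = 460
```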

Here’s an article which I think explains it reasonably well.

https://www.imatest.com/docs/nyquist-aliasing/

3 Likes

Thanks a lot for this. I haven’t read the article yet, but it would be nice if there were an easier way to predict when our settings will run up against those limits than the current trial and error.

1 Like

Yeah, I’m not sure this is possible.

This is because the high-frequency pattern is emergent in nature, and only exists in the final image once the different elements have all been combined.

3 Likes

Thanks! At least now we know why, and that there’s a hard physical limit to these things rather than just strangeness on the software side.

So the real solution might lie in keeping these limits in mind when designing and tweaking the particular CRT look that one desires.

2 Likes

Yes, this is exactly it - in order to mitigate moire effects you need to render at a higher resolution and downsample in an intelligent manner, i.e. anti-alias. This, though, is pretty much what we’re already doing in these shaders (in a certain respect): taking a 240p image, generating a higher-resolution image, and then downsampling the (continuous/analogue) phosphors into pixels.
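As a throwaway toy version of the render-high-then-downsample point (1-D numpy with made-up numbers, not an actual shader pass):

```python
import numpy as np

def fine_pattern(samples, cycles=1000):
    # A fine pattern well past the Nyquist limit of a 1080-sample grid.
    y = np.arange(samples) / samples
    return 0.5 + 0.5 * np.cos(2 * np.pi * cycles * y)

target, oversample = 1080, 4

# Sampling straight at the output resolution aliases the pattern into a big,
# full-contrast 80-cycle stripe (1080 - 1000 = 80): classic moire.
naive = fine_pattern(target)

# Rendering at 4x and box-filtering down stops the unresolvable detail from
# punching through at full strength.
antialiased = fine_pattern(target * oversample).reshape(target, oversample).mean(axis=1)

print("naive stripe contrast:      ", round(naive.max() - naive.min(), 2))
print("antialiased stripe contrast:", round(antialiased.max() - antialiased.min(), 2))
```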

Of course, when we go stupid on curvature you’re likely to get moire effects, but at the amount of curvature you see on a real CRT I’m not so sure you should. However, as with most things in life, a lot of things surprise you.

3 Likes

I think the reason we see more of it in the shaders is that there is no antialiasing on the masks (not that we want that), because the masks are exactly registered to the output pixels.

I think that rendering the scanlines as a separate pass at a higher resolution, then downsampling to the final resolution to add some antialiasing, might at least help reduce any moire coming from the scanlines. The mask would then be applied after this.
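Roughly this ordering (a 1-D sketch with stand-in numbers, not the Mega Bezel’s actual passes):

```python
import numpy as np

def scanline_pass(samples, lines=240):
    # Toy beam profile rendered on a grid 'samples' tall.
    y = np.arange(samples) / samples
    return 0.5 + 0.5 * np.cos(2 * np.pi * lines * y)

def mask_pass(samples, period=3):
    # Toy grille-style mask, registered exactly to output pixels.
    return np.where(np.arange(samples) % period == 0, 1.0, 0.3)

target, oversample = 1080, 4

# Pass 1: scanlines at 4x the output height, box-filtered down to 1080,
# which is where the antialiasing comes from.
scan_aa = scanline_pass(target * oversample).reshape(target, oversample).mean(axis=1)

# Pass 2: the mask is applied afterwards, 1:1 to the output pixels, so it
# stays crisp while the scanlines are softened.
final = scan_aa * mask_pass(target)
print(final[:6])
```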

2 Likes

What I’m planning to do is treat the scanline as being infinitely thin, project it onto the curved surface, and then build the scanline at each phosphor as you normally would, but with a bit of scaling added for perspective. In my head this should work and would hopefully mitigate moire effects, but the proof is in the pudding - this could be what all these shaders do already. (It could also end up looking too faux.)
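Roughly what I have in my head, as a toy sketch (names and numbers are illustrative only, not anything the existing shaders do):

```python
import numpy as np

def scanline_intensity(v_out, lines=240, curvature=0.1, beam_sigma=0.35):
    # Warp the output coordinate by a toy curvature, measure the distance to
    # the nearest (infinitely thin) scanline, and evaluate a beam profile whose
    # width is scaled by the local stretch so perspective is accounted for.
    c = v_out - 0.5
    v_src = 0.5 + c * (1.0 - curvature * c ** 2)             # toy curvature warp
    stretch = 1.0 - 3.0 * curvature * c ** 2                 # d(v_src)/d(v_out)
    dist = np.abs(v_src * lines - np.round(v_src * lines))   # to nearest line centre
    return np.exp(-0.5 * (dist / (beam_sigma * np.abs(stretch))) ** 2)

v = np.linspace(0.0, 1.0, 1080, endpoint=False)
print(scanline_intensity(v)[:6])
```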

I’d do what you’re proposing in VR, as we potentially have a lot higher resolution to work with there, i.e. the foveated area can be very small compared to the screen. But no, I’m not proposing we all move to VR to play retro games. :joy: Or am I? The more I think about it… :rofl:

3 Likes

Sounds like a cool experiment :star_struck:! I’m looking forward to seeing how it goes!

Most of the shaders do what the Mega Bezel currently does: input the curved coordinate into the scanline calculation, then apply the mask over the top, directly registered to the pixels at the final resolution.

Royale does something different: it resamples the mask scaled to the desired triad size, renders all of this at the viewport resolution and applies it on top of the scanlines, then has another pass which projects this into the curved projection space with some antialiasing, which reduces some artifacts. This is why Royale looks MUCH better at 4K than it does at 1080p, where it looks more muddy.

If you’re interested in the Royale curvature, I separated it out into its own file in the Mega Bezel, but if you have some experience with 3D projections you will probably be able to write something much simpler that gives as good a projection, since the Royale version has a very heavy performance cost.

5 Likes

The one thing about what I’m going to try is that it’ll only really work for curvature around the y axis, like my Sony PVMs have.

Obviously there are a lot of TVs that are bulbous, and I won’t be able to do very much with them without a lot of super-sampling or starting to use temporal techniques. Mind you, thinking about it, it’s all static, so there’s the possibility of loading in a precalculated high-resolution coverage mask.

3 Likes

Not so much a question about Royale, but in general - how does this consideration play out if you’re rendering at 1080p (i.e. my potato box) but outputting to a TV that is 4K (while both Windows and RA are at 1080p)?

Would that give the muddier 1080p look or the better 4K look?

It would give the muddier look… you will just have a better view. :grin:

Usually when you do this the TV is doing a bilinear upscale, which blurs the pixels together, so you will actually get a blurrier picture than a 1080p image on a 1080p monitor/TV. If you can get the TV or your video card to use nearest-neighbor / integer scaling, that would result in an image as sharp as on a 1080p panel.

Generally, if you don’t have integer scaling in the video card or the TV, then for the best picture I wouldn’t recommend outputting at a resolution different from the TV’s native resolution, because it creates a muddier picture and can do weird things with the mask. If you don’t have a strong mask, the main issue is just that it’s blurrier.
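A quick toy illustration of the difference (made-up 1-D numbers, not what any particular TV scaler actually does):

```python
import numpy as np

# A crisp, pixel-registered mask pattern at 1080 samples, shown on a
# 2160-sample panel in two different ways.
mask = np.tile([1.0, 0.3, 0.3], 360)

# Integer / nearest-neighbour 2x: each source pixel becomes two identical
# panel pixels, so the mask survives untouched.
nearest_2x = np.repeat(mask, 2)

# Bilinear-style 2x: the in-between panel pixels are blends of neighbours,
# so the hard 1.0 / 0.3 edges get smeared.
positions = np.linspace(0, 1079, 2160)
bilinear_2x = np.interp(positions, np.arange(1080), mask)

print("nearest :", np.round(nearest_2x[:6], 2))   # 1, 1, 0.3, 0.3, 0.3, 0.3
print("bilinear:", np.round(bilinear_2x[:6], 2))  # edges pulled toward 0.65
```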

1 Like

I did some testing at 4K of the curvature currently implemented in the Mega Bezel:

crt-sony-megatron-sony-pvm-2730-sdr (I think :wink:)

They seem to do pretty well at 4K; I’m not really seeing any moire, even on the Atomiswave one (crt-sony-megatron-sammy-atomiswave-sdr).

I wish my monitor were brighter, though; it’s not quite bright enough. I think it’s HDR-400.

EDIT:

I also took a close-up shot of my LCD screen; it’s fun to see the “phosphors”, which blend together at a normal human viewing distance.

To view any of the images properly, use “open image in new tab”; then you can zoom in.

4 Likes

I figured, lol. Good to know haha

Agreed, it’s not ideal to output at the lower resolution, because then you’re trusting the TV or the (already stressed) video card to handle the upscaling.

On the other hand, this is the box I have, and it would still be fun to use it on a 65" display (at least some of the time), so I will probably just hunt and peck through the shader repos to find the best-looking shader for each core. On some level, since all of the shaders are about altering the image in some manner, maybe there will be some happy accident where something just looks good even if it isn’t how the shader author meant it to look.

2 Likes