Mega Bezel Reflection Shader! - Feedback and Updates

Up until now I’ve been testing out my RA setup (with the Mega Bezel) on a 1080p LCD TV. I moved the box to a different TV, which is 4K, and ended up with some interesting observations.

  1. Even when I forced the Windows resolution to 1080p, everything ran more slowly with the Mega Bezel loaded while connected to the 4K TV, even with “game mode” on. I don’t just mean in-game, but also in the RA menus: less responsiveness overall, lagging input, etc. Windows and RA both reported the TV’s refresh rate as 60 Hz, so that wasn’t the issue. Because of the slowness, I elected to turn the Mega Bezel off (sadly). I figured that setting the Windows resolution to 1080p would make the experience basically identical to being connected to the 1080p TV in the other room, but for whatever reason, that didn’t seem to be the case.

  2. My controller had more lag while the box was connected to the 4K TV (with and without the Mega Bezel). Obviously this has nothing to do with the Mega Bezel or any shader. The controller is an SN30 Pro, connected over Bluetooth. Connecting it via USB cable eliminated the lag. What’s strange is that the lag isn’t present with the other TV. Baffling. I’ll probably “resolve” this by switching from Bluetooth controllers to 2.4 GHz USB-dongle controllers.

Weird stuff, but hey, that’s an emulation box for ya…

EDIT: Did some more testing this morning. I changed the core overrides to just use other shaders and an overlay, then moved the box to the 4K TV again. Issue #2 is gone: full responsiveness in-game and in-menu without any use of the Mega Bezel. I wonder if what really happened is that the Mega Bezel attempted to run at 4K resolution last night, before I’d made the Windows resolution change, boosting the thermals and then throttling the whole system. Maybe that, in turn, caused all the menu slowdown and Bluetooth input lag.

I’d still love to use the Mega Bezel, so I’m going to do some more tests later in the week with a different DP-to-HDMI adapter that is limited to 1080p/60 Hz output, which will mean that Windows cannot automatically adjust the resolution (beyond 1080p) when swapping between screens.

2 Likes

In your testing, try turning on the resolution debug text, which is the first parameter in the list. This will print the final resolution the shader is running at, labeled Viewport Res.

I mention this because I’ve had RetroArch running at a resolution I wasn’t expecting before (different from the desktop resolution), and this is a good way to see if that’s happening.
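If you want that readout enabled every time without digging through the menu, you can also set it in a small override preset. A minimal sketch, assuming the parameter is named HSM_RESOLUTION_DEBUG_ON and using an example #reference path; verify both against the parameter list and preset layout in your own Mega_Bezel install:

# Sketch only: the #reference path and parameter name below are
# assumptions -- check them against your local install.
#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__3__STD.slangp"
HSM_RESOLUTION_DEBUG_ON = "1"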

1 Like

I’ll do that next time I have the box hooked up to the 4K TV upstairs. I thought there could be some sort of resolution-related issue contributing to all of it (including the perceived input lag). If RA and Windows are not on the same page, and the shader is in the mix as well, I could see things going sideways.

3 Likes

In-progress shots of the Mega Bezel with the Megatron integrated, SDR version. The global graphics brightness is set to 40% so its brightness sits better alongside the full mask that the Megatron uses. These SDR presets are meant to be used on a monitor with fairly high brightness; turn the monitor brightness up to compensate for the mask.
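In preset terms that’s a one-line override, something like the sketch below (treat the parameter name and 0-100 percent scale as illustrative and double-check them in the graphics section of the parameter list):

# Illustrative name and scale -- verify in the parameter list.
HSM_GLOBAL_GRAPHICS_BRIGHTNESS = "40"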

7 Likes

That’s looking awesome, @HyperspaceMadness! I can’t wait to get my hands on this to try it out! I think I’ve said before that many SDR laptop screens go very bright even though they don’t have HDR capabilities.

HDR is a weird thing in that respect: my 2018/19 Dell XPS laptop doesn’t have HDR support, but it does have ‘HDR playback’ support, as in you can’t play HDR games or have an HDR Windows desktop, but you can play back HDR videos. I’m not quite sure what the difference is, but the display gets very bright when plugged in, so who cares! :smiley:

3 Likes

I play back HDR video files on my non-HDR TV. I just manually created a calibration profile that makes everything look relatively normal.

Remember, many displays were more than bright enough long before HDR became a standard and the likes of Dolby Vision, HDR10 and HLG came along.

So you can still achieve quite a bit with SDR displays.

3 Likes

Do regular IPS monitors benefit from the SDR presets?

Thank you

2 Likes

Yes, it will put the Sony Megatron into the right colour space for normal SDR IPS monitors. You can tweak brightness, contrast, saturation and gamma to your heart’s content after setting the shader to SDR mode.
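As a rough sketch of what that looks like as parameter overrides: the hcrt_* names below are how the standalone Sony Megatron exposes its settings, and they may differ inside the Mega Bezel integration, so treat them as illustrative and confirm against the parameter list in your build:

# Illustrative only -- confirm names and values in your build.
hcrt_hdr = "0.0"          # 0 = SDR mode / SDR colour space
hcrt_brightness = "0.0"   # then tweak these to taste
hcrt_contrast = "0.0"
hcrt_saturation = "0.0"
hcrt_gamma_in = "2.4"
hcrt_gamma_out = "2.2"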

2 Likes

I don’t know; apart from laptop displays (which the manufacturers thought might be used outside), I’m not so sure there were many displays outside of professional monitors that got that bright. Generally the very bright SDR displays (bright enough for proper masks) are HDR-capable; HDR became ‘mainstream’ around 2015.

3 Likes

Yeah, looking forward to seeing what you think :grinning:. My GitHub is updated (link in the first post) if you want to grab it from there, though I will release it as a package pretty soon.

Thanks for your contribution to the community & retrogaming with this great shader!

4 Likes

Oh wow! I will as soon as I have a chance to get on my PC, although that’s probably going to be Tuesday or later. Which repository is it in, btw?

2 Likes

I’ll be up front: they don’t look very much like an actual CRT when you add that glass smoothing, BUT I am glad you’ve found something you like, and I do think they look amazing in a stylized CRT way. I’ll have to have a go and see what they look like in person. I’ve got a very dull CRT that this might brighten the image up for.

EDIT: Maybe what I’m looking at is upscaling/downscaling rather than the glass smoothing, since you’re grabbing it off a 1200p monitor.

It’s this one:

You need to put the Mega_Bezel folder/repo in the shaders_slang/bezel folder.
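On a typical install that ends up looking something like this (the exact RetroArch root varies by platform and build, so treat the layout as a guide):

RetroArch/
  shaders/
    shaders_slang/
      bezel/
        Mega_Bezel/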

4 Likes

One thing to note about the integration with the Mega Bezel: all the normal curvature methods are available, but by default the CRT curvature scale multiplier (HSM_CRT_CURVATURE_SCALE) is set to 0, which removes curvature to avoid moire.

Set this to 100 to get curvature that matches the main curvature. Be warned though: if you are at 1080p this will cause some ugly moire (that’s why it’s set to 0 by default), but I think at 4K it will probably be OK.

HSM_CRT_CURVATURE_SCALE = 0

HSM_CRT_CURVATURE_SCALE = 100
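If you’d rather set this in a preset than in the quick menu, a minimal override sketch looks like this; the #reference path is just an example, so point it at whichever Megatron-integrated preset you actually load:

# Example path -- point #reference at the preset you actually use.
#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__3__STD.slangp"
# 0 = flat (default, avoids moire at 1080p); 100 = match main curvature
HSM_CRT_CURVATURE_SCALE = "100"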

5 Likes

Yes, so obviously I can’t stop anybody adding curvature or blurs, blooms, smooths or anything else they want (and nor should I). I’d just say it’s not in the spirit of the Megatron shader.

Curvature is maybe the outlier there, as it is something I’ve been thinking about. What I’m considering is implementing curvature in a way that is hopefully closer to a real CRT, although you can never really get there, at least not at resolutions close to today’s or without something like VR.

The big thing I’d look to resolve is why we’re getting moire when we use curvature. If you look at actual CRTs, which are all curved, we obviously don’t get moire.

You could argue it’s to do with the use of discrete pixels, and maybe that’s a contributing factor, but I’m thinking it’s more to do with the way these shaders are implementing curvature.

Blurring the hell out of the image to overcome moire, as I’ve seen some threads argue for, is not something I think is a good solution.
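For context, a lot of shaders implement curvature as a simple barrel-style warp of the sampling coordinate over a flat framebuffer. This is a generic sketch in slang/GLSL terms, not code from the Megatron or the Mega Bezel:

// Generic barrel-distortion curvature sketch (illustrative, not from
// any particular shader). warp controls how strongly the edges bend,
// e.g. vec2(0.03, 0.05).
vec2 curve_uv(vec2 uv, vec2 warp)
{
    uv = uv * 2.0 - 1.0;                      // remap to [-1, 1]
    uv *= vec2(1.0 + (uv.y * uv.y) * warp.x,  // stretch x by y^2
               1.0 + (uv.x * uv.x) * warp.y); // stretch y by x^2
    return uv * 0.5 + 0.5;                    // back to [0, 1]
}

Since the scanline/mask pattern gets sampled through that warped coordinate, its pitch is no longer constant in output pixels; near the edges it drifts in and out of phase with the display’s fixed pixel grid, and that beat is exactly what a real CRT never has, because its phosphor grid is physically curved along with the tube.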

2 Likes

Ngl, I usually agree with your takes, but moire is definitely a thing that happens on actual CRTs. (It just varies in how prominent it is, i.e. how much you can actually notice it.)

I will say that the less prominent the scanline visibility is on a CRT, the less likely it is to present visible moire.

4 Likes

The moire on a CRT is due to completely different factors and looks different, though. It’s also usually not nearly as noticeable.

3 Likes

It can, and it can also show up on film exactly like we deal with it in shaders, IIRC.

I’m just saying it’s definitely a thing that happens; to say it doesn’t is wrong.

And honestly, I’m pretty sure I’ve screamed from the rooftops about what’s causing the moire we tend to experience in shaders, repeatedly, over multiple different threads. (Do I have a solution? Nope.)

4 Likes

I would add that it’s not limited to curved presets, either. Anywhere you have multiple fine, high-contrast grids, sets of lines, or fine patterns close together, the illusion can occur. Viewing distance also plays a role in the appearance of moire; it tends to be more noticeable for me beyond a certain distance from the screen.

Some of the patterns competing for “space” include our scanlines and masks (including faux grille wires), as well as the black space and the general pattern that our LCD subpixel layouts and pitch create.
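A rough rule of thumb for why two fine patterns produce such large, visible bands: overlaying two gratings with nearby spatial frequencies yields a beat at the difference frequency,

f_beat = |f_pattern_1 - f_pattern_2|

so a scanline or mask pattern that lands even slightly off the display’s subpixel pitch produces a very low-frequency, and therefore very visible, banding pattern.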

It’s probably my Achilles’ heel when it comes to trying to get certain presets looking right when scaled down to fit certain overlays. Lots of compromises in scanline gamma had to be made.

3 Likes

Basically a fancy way of saying what I’ve been saying.

High contrast patterns cause optical illusions, brain and eyes go brrrrr.

Honestly at this point I’m starting to believe CRTs relied solely on optical illusion black magic to function correctly :joy:.

3 Likes