Mega Bezel Reflection Shader! - Feedback and Updates

Is there a way to get this working on a pi4? I don’t see the option to select the Vulkan driver anywhere. I have GLCore, but for some reason none of the shaders show up when I select that.

1 Like

You need to save the glcore driver as the default and then restart RA to see the shaders.
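For reference, picking glcore under Settings > Drivers > Video and saving the configuration should end up as a line like this in retroarch.cfg (the location of the config file varies by install):

video_driver = "glcore"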

That being said, I don’t think the pi4 has enough juice to run the Mega Bezel.

Even a decent PC without a discrete GPU struggles to run it (except in rare cases).

I tried that, but they still don’t show up for me, for some reason. Bummer. If it’s not beefy enough, then I suppose it doesn’t matter. Thanks, though!

1 Like

You are welcome. If you would like to try my graphics without the Mega Bezel, they are in a GitHub repo.

https://github.com/Duimon/Retroarch-Overlays

In the “logo” folder.

If you are looking for a painless setup, modified versions are the default in Batocera OS v33.

2 Likes

@HyperspaceMadness

Something that’s been bugging me for a while, maybe it can be fixed in the next release:

Phosphor gamut (under grade) should be set to 0.00 by default. You need a wide color gamut display and you need to re-calibrate the display in order for any of the phosphor gamuts to look right. It’s really advanced stuff. For 99% of users it’s just going to make the image worse.

Also, GDV’s color temp control should be re-added, as grade’s only works when you have a phosphor gamut enabled.

Also, I strongly feel that the vignette toggle should be at 0.00 (off) by default, since this is an artistic thing not directly related to color management. I’ve always kinda wondered why it was in grade in the first place.
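In the meantime, here’s a rough sketch of how anyone who agrees can override these today with a simple preset that uses #reference to point at whichever base preset they use. The parameter names below are from memory and may not match the current release exactly, so confirm them in the shader parameters menu first, and adjust the #reference path so it points at the base preset relative to where you save the file:

#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__1__ADV__GDV.slangp"

// Assumed parameter names, check the shader parameters menu for the real ones
g_crtgamut = "0.0"
HSM_SCREEN_VIGNETTE_ON = "0.0"

Save it with a .slangp extension and load it like any other preset.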

3 Likes

Thanks for the feedback :smile:

Hmm, so I thought this was set up to match the default of the standard grade.

I thought this was still there; I do mean for it to be there. I’ll check on this.

So the vignette is on by default, but it’s not the vignette in grade; it’s applied afterwards in the post-CRT step where I do all the tube effects. AFAIK the vignette in grade is disabled and its parameters removed.

The main thought behind having the vignette on by default is to help the screen “integrate” with the surrounding graphics. I’m not sure whether many users feel the same way as you. I don’t think I’ve gotten any complaints about the vignette yet, unlike the tube static reflection, which I got numerous requests about how to get rid of, so I subsequently turned it off by default.

3 Likes

Suggesting that this needs to be “fixed” implies that there’s something broken or wrong with the way things are currently set up. I don’t necessarily agree that there’s something wrong with the way things are currently.

Users are already free to adjust these settings to suit themselves, their TVs/displays, and their needs. The ones who wouldn’t bother will likely just use presets in which the preset creator has already pre-adjusted them.

Has this been empirically and scientifically determined, or is this just based on anecdotal information, a hunch, or your personal opinion and feelings?

Perhaps a poll might be a good starting point if one is to make such determinations that affect others, even if one’s intentions are sincere and genuine.

The vignette is nice man. You don’t like the vignette?

Keep expressing your vision and your sense of style, creativity and nostalgia in your work man. There are already other projects and threads which are solely focused on “accuracy without compromise and at any cost” that require the user to have special TVs/displays, relatively extreme settings (to get the most out of them) and thus may not really be as accessible to the general public. It’s good to have choices as there are many different views as to what looks best and represents the “true” CRT experience for different people.

I generally use whatever tools are provided to me by my favorite shader creators to the best of my ability to get things the way I want them to look. If I see something that doesn’t seem to work right or as intended, I bring it to their attention.

For me it’s about balance.

You need a wide color gamut display, and at the very least you need to re-calibrate the display in order for the gamuts to not appear oversaturated and clipped, as per Dogway. You’re free to dig through the Dogway grade shader thread for the relevant posts if you like. 99% of users are not going to recalibrate the display and/or do not have a wide color gamut display. Which brings us to…

It’s for this very reason that phosphor gamut should be set to 0 by default.

It’s not about whether or not I think it looks nice; it’s about the purpose of the grade shader, which is… to grade. IIRC Dogway added it purely because it was convenient with a particular CRT Royale setup he was using at the time, or he wasn’t happy with the existing vignette effects and there wasn’t a better place for it, or something. It doesn’t really fit with the overall concept of the shader; it’s kind of tacked on. Anyway, it’s completely moot since, according to @HyperspaceMadness, it’s been turned off by default.

1 Like

Yes, but the default of the standard grade is set with the expectation that the user will create a LUT for the monitor for use with a particular color space and gamut, which IIRC is why gamut 0.00 was eventually added…

I was fairly involved with the discussion around the grade shader as it was being created and tested, if you recall.

1 Like

Ah, OK, then we should definitely change it to 0, as I don’t expect people to do this.

I’m not sure if this is moot or not, because a vignette is still added in the Mega Bezel later in the chain.

I double checked this, and Guest’s color temp controls are still in there. Let me know if there’s some particular situation where you’re not seeing them.

2 Likes

It’s moot in the sense that I was only concerned about the vignette being part of a color grading shader, especially since it was kinda just tacked on there by Dogway. I think he ran out of room for parameters in a shader setup he was using? This was like 2 or 3 years ago. It’s fine to include in the bezel shader, since it fits within that shader’s concept and scope.

Is it under “Color Temp and Hue”?

If so, it’s acting just like grade’s color temp control. It doesn’t work (it does nothing) when phosphor gamut is set to 0.00. GDV’s, on the other hand, works regardless of what the phosphor gamut is set to.

1 Like

@HyperspaceMadness

Re-posting in case you missed my ninja edit.

2 Likes

OK, so grade’s color temp only works if we are using one of the phosphor gamuts (i.e., not zero), and if the user uses one of these phosphor gamuts, they must use a LUT for it to work correctly?

If so, this isn’t a great setup for the average user, as I imagine only 5% or fewer would use a LUT :worried:.

No, I think the Guest color adjustment is in a section called “Color Tweaks.”

2 Likes

Correct. It’s really for the advanced user.

I see it now, thanks!

2 Likes

@HyperspaceMadness: Hi! I finally upgraded to the latest. What a performance boost! Amazing work as usual. Previously, I had to use DOSBox Pure with this preset “STANDARD HD-CORE” from the base_crt_presets. What is the current equivalent of that one? Thanks!

(Duimon’s DOSBox preset uses “MBZ__1__ADV__GDV”, but won’t load in DOSBox Pure.)

2 Likes

Great, that’s awesome to hear that you’re seeing the performance benefits :smile:

The STD should be equivalent to the old HD-CORE presets. :slight_smile:

Good luck!

Also, there is a table of equivalent old vs. new presets that you can refer to in the readme, although the HD-CORE presets aren’t on there.

3 Likes

Thanks for the quick info! Running Quake with Duimon’s DOSBox graphics and your HSM shader looks epic!

Concerning the performance, I can now run almost all cores with advanced presets. And this is with a PC going on 10 years old (GTX770). :crazy_face:

4 Likes

Ever wonder if it could be something pretty simple?

I think it would be a significant investment in time to debug the D3D loading time problem, with an unknown probability of success.

For the foreseeable future, I feel like there are better things to spend time on that will be more fruitful.

4 Likes

Even on an RTX 3080, Vulkan stutters in Dolphin. It’s a shame that the only way to get a stutter-free experience, at least on my machine, is by using the D3D driver for Dolphin. Come on, nVidia!

2 Likes