I found another issue with newpixie rolling scanlines. Some scenes just look too dark. Under brightness settings, I set gamma correct to 2 and it seemed to fix it.
EDIT: Well, sort of fixes it. Now in brighter scenes it doesn’t look right lol.
Newpixie is generally darker than most shaders. Even the standalone suffers.
Did you try the “Bright Boost Bright Pixels” in the Brightness Settings?
“Bright Boost Dark Pixels” seems to help in dark scenes too, but it also makes brighter things too bright.
So this is probably related to the contrast adjustment I added. Go to the Grade settings and set contrast to 0 (I had it set at 0.5).
I’ll probably set this to 0 on the next iteration. 
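If you want to lock that into the preset file itself, the contrast control in Dogway’s Grade shader is exposed as a regular parameter. I believe the name in the stock grade.slang is g_cntrst (worth double-checking against your copy), so the override line would be:

g_cntrst = "0.000000"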
Yeah that definitely helps. Going into the negatives improves it even more but then causes issues again with brighter scenes. Like Duimon said, if it’s just a dark shader I guess there’s only so much that can be done.
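As a side note, for anyone who would rather pin the brightness settings discussed above into a preset file than adjust them in the menu, they map to regular parameters too. The names below are taken from guest’s crt-guest-advanced shader, so treat them as a best guess for the NewPixie-based presets and verify them against the parameter list in your version:

gamma_c = "2.000000"
brightboost = "1.400000"
brightboost1 = "1.100000"

Here gamma_c is Gamma Correct (the 2.0 mentioned above), brightboost is Bright Boost Dark Pixels, and brightboost1 is Bright Boost Bright Pixels (the last two shown at what I believe are their stock defaults).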
Hi! I’ve been loving these shaders for a few months now, and I have a question: is there a simple way to keep all the curvature and reflections but disable the CRT and/or scanline effects? I noticed that for streaming it’s better to have sharp pixels, and I was wondering if I can turn off only that part of the shader. Thanks!
Glad you are enjoying them!
To get a sharp pixel look you should set both of the following parameters to 1 and 100:
[ CRT vs ORIGINAL A/B COMPARE ]:
If you want to just drop it into an existing preset you can add these lines:
HSM_CRT_ORIGINAL_AB_COMPARE_ON = "1.000000"
HSM_CRT_ORIGINAL_AB_COMPARE_ON = "100.000000"
I wanted to share something cooking in the HSM kitchen
I’m trying something which splits the image and pushes a copy to each side of the viewport space. For this demo it’s being used in conjunction with the new Full aspect mode I added to RetroArch, which should be in the next version. The Full aspect mode basically makes the viewport take up the entire space of the window (or the full monitor if in full screen mode).
So this means that you can create a 16x9 background image, set it to split, and it can still fill the sides on a 32x9 monitor.
The Full aspect ratio mode also means that if you are using the Mega Bezel you don’t need to change your RetroArch aspect ratio when moving between different monitor aspect ratios, e.g. landscape vs portrait.
This is fantastic @HyperspaceMadness!! This needs to exist, and I believe it has to run in full screen to look even better, right?
Yes, you can certainly run it full screen, that’s how I expect most people will be playing.
I did the example in windowed mode because it shows what it’s doing more clearly.
I have not been here in a while; what you have been doing since is unbelievable. Absolutely freakin’ amazing…
I’ve been thinking that the Mega Bezel needs a new original default background for the shader instead of the SNES-looking one.
So I’ve put together my own 2-minute Inkscape v0.1 concept of a non-existent CRT television.
It’s under CC BY-SA 4.0, so feel free to do whatever you want with it.
Mega Bezel is so good with my current setup that I could actually see not replacing my CRT if it breaks. I don’t think any pictures can do it justice. Such a great way to play these games on a big 16:9 screen.
Out of curiosity, what model of graphics card do you own?
I’m not sure if you mean me or BendBombBoom, but I use an RTX 2060.
At 4K, for a 16-bit game (e.g. Genesis, SNES), I get this performance
I have an Nvidia 970 and a 1080p TV; I guess I fall kind of short of the HSM Mega Bezel hardware needs, right? BTW, if I want to use HSM with a 4K TV on this same video card, is it recommended to send a 1080p signal and let the TV upscale to 4K? Or is a lot of detail lost, making it not worth the effort?
For running the Mega Bezel Advanced presets at 4K I think you are right, unless you are running a Basic preset.
You can totally send a 1080p signal to the TV and the graphics and everything will still look OK. The only thing to be a bit aware of is that when something is upscaled like this, it actually ends up blurrier than if it were running on a 1080p TV.
In short, 1080p sent to a 4K tv will be more blurry than 1080p sent to a 1080p tv.
The other thing to be aware of is that if you do this with something which has strong masks, you can get some strange-looking results because the masks get interpolated, but this may not be a problem.
I would suggest just trying it and seeing how you like it.
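If you want to try forcing the 1080p signal, RetroArch can request a specific fullscreen resolution via retroarch.cfg (video_fullscreen_x/y are standard RetroArch options; whether the TV or the GPU ends up doing the upscale depends on your OS and driver settings):

video_fullscreen = "true"
video_fullscreen_x = "1920"
video_fullscreen_y = "1080"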
A 3080 attached to a 4K screen, and a 2060 laptop with a 1080p screen.
A question and a suggestion. Question: do you think the latest GDV will be added with the NTSC passes? I haven’t seen the rainbowing and artifacting in the other Mega Bezel presets or options.
Suggestion: I like GTU, but it’s too extreme/blurry on Dreamcast content. A multiplier option that kicks in above a certain resolution would be useful. It could be linked to the fake scanline resolution threshold.
I will probably try to. The issue I have right now with GDV NTSC is that the GDV shaders (crt-guest-advanced.slang & linearize.slang) are different from the GDV NTSC shaders (crt-guest-advanced-ntsc.slang & linearize-ntsc.slang), rather than being one shader with some parameter adjustments. So it is possible for me to maintain two versions of my adjusted GDV shader, but the bigger problem is that they require different shader chains. I would really like to be able to add NTSC to the chain so that it’s something that can be completely turned on or off; otherwise it means more base presets and less flexibility.
@guest.r, could you chime in on what you think about the adjustments in crt-guest-advanced-ntsc.slang, whether they could be folded into crt-guest-advanced.slang, or what the reasoning is behind keeping them separate?
I do have NTSC Adaptive passes set up to swap out with GTU. I’m not sure if NTSC Adaptive does enough to match what GDV-NTSC is currently able to do, so I’ll probably make an experimental preset with NTSC-Adaptive in the next release. The other question is whether people would be happier with NTSC-Adaptive, or whether it would lose what they like about GTU.
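To give a rough idea of what swapping in NTSC-Adaptive looks like at the preset level: it is two extra passes spliced in ahead of the linearize/CRT passes. The fragment below is only illustrative; the pass numbers are placeholders for wherever the passes land in the full Mega Bezel chain, the paths follow the common slang-shaders layout, and the scale/filter settings for each pass would need to be copied from the stock ntsc-adaptive.slangp, since they matter for getting the artifacting right:

shader6 = "shaders_slang/ntsc/shaders/ntsc-adaptive/ntsc-pass1.slang"
shader7 = "shaders_slang/ntsc/shaders/ntsc-adaptive/ntsc-pass2.slang"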
The setting on GTU is just the default, so we could perhaps change the default to something better. Do you think the values you would use for Dreamcast would make a good default? What do you use for this?
But if what you really want is different values for 240p and 480p content, then we could definitely add a multiplier which would kick in when we reach the trigger resolution.
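As a rough sketch of how that multiplier could work inside the shader: here signalResolution is GTU’s existing horizontal resolution parameter, while CoreImageHeight and the two HSM_ names are hypothetical placeholders (not existing Mega Bezel parameters), standing in for however the chain exposes the core’s vertical resolution and the new user settings:

// Hypothetical sketch: widen GTU's signal resolution once the core's
// output height crosses the same threshold used for fake scanlines.
float multiplier = (CoreImageHeight > HSM_FAKE_SCANLINE_RES_THRESHOLD)
                       ? HSM_GTU_HIRES_MULTIPLIER   // e.g. 2.0 for 480p content
                       : 1.0;                       // leave 240p content untouched
float effectiveSignalRes = signalResolution * multiplier;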