CyberLab Death To Pixels Shader Preset Packs

Yep, that will work, but you will be cropping the screen a bit. You may also notice that you can now see the CRT through the Bezel Inner Edge Highlight.

In any case, we have far more options with the Mega Bezel than anything previously available!

So the short answer to your original question is… no, I don’t have plans to create a new preset using Cyber’s work at this time. Sometime in the future I may choose a favorite and do so, who knows?

There is a chance that at some point we will be able to simply add a second reference preset and chain the two together. It has been talked about.

Again, many things take precedence. :wink:


Yep, just noticed that and changed the Scale Offset back to 100. Change the [Bezel] Width to 48 and the Height to 38 instead.
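In a saved preset, those [Bezel] values are just plain key = value parameter lines. A minimal sketch, assuming current Mega Bezel parameter names (the names here are assumptions — check a preset you've saved from the menu for the exact spelling in your version):

```
// Illustrative lines from a saved .slangp preset.
// Parameter names are assumptions and may differ between Mega Bezel versions.
HSM_BZL_WIDTH = "48"
HSM_BZL_HEIGHT = "38"
```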

Preferences vary from person to person, but patience is always needed. New features are always great, and I'm looking forward to everything that will be coming in the future. I guess for the time being I can just use the presets that I prefer.


If you choose to create your own custom presets, I would wait until the Mega Bezel hits v1.0.

The good news is that you will only have to do them once and can enjoy them forever.

Honestly, it is a joy to see anyone take advantage of the customizing features. I will assume that most will just use the supplied presets as is.


Especially since the vast majority wouldn't know where to start with any of it and want a plug-and-play style. My previous posts will hopefully help those who want to combine the two.

Good luck to you and @HyperspaceMadness in figuring this stuff out.


That would be recommended to anybody creating custom presets. I was honestly just talking about using Cyber's presets with your graphics. I would definitely have to make sure I want to do it, since it would be time consuming to edit every one of the presets.


Wow! I’m glad to see you guys are having fun with my presets! I’ll have to read through and respond properly a little later.

Curvature = 0 is intentional. Although your screenshots look good even with it turned back on!

If you’re using BSNES/HIGAN you can try setting the internal resolution to 512 x 224 for a nice crisp, more authentic look in Super Mario World. You can also try my brand new Computer Monitor Presets.

I’ve seen a few users asking how to combine my presets with @Duimon’s graphics and I’ll see if I can include some instructions and examples and eventually a video guide.

Adding @Duimon’s graphics to all of my presets shouldn’t really be that difficult a task once you use the right tools. Most of my presets reference others in the pack, so there are probably only about 2 or 3 which actually reference the Mega Bezel Base CRT presets. Once you chain those to @Duimon’s presets, all the others will fall in line.
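As a sketch of that chaining (all paths here are illustrative, not the actual pack layout): the few presets that point directly at a Mega Bezel base preset do so with a `#reference` line at the top of the `.slangp` file, and swapping that line to point at a @Duimon preset pulls his graphics into the chain while the parameter overrides below it still apply:

```
// Top of a CyberLab .slangp preset (illustrative paths):
// before - references a Mega Bezel base CRT preset:
// #reference "../Mega_Bezel/Presets/MBZ__3__STD.slangp"
// after - reference a Duimon graphics preset instead:
#reference "../Duimon-Mega-Bezel/Presets/SNES.slangp"
// parameter lines below the #reference continue to override as before
```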

I appreciate the interest in these settings. Feel free to continue to provide feedback as well as requests!


I figured the Playstation core out enough.

Change the parameters to get it working properly.

[Aspect Ratio] Orientation: 1.00

[CRT Screen Scaling] Int Scale Mode: 0.00

[Scanline Direction] Scanline Direction: 1.00

Both the images use the Cyber Preset.

Disregard all the other changes in previous posts. Setting the Aspect Ratio Orientation is only necessary in the shaders that are going to be used with PlayStation cores and their weird aspect ratios. If you are going to use all of the shaders with PlayStation cores, then just use the 3 changes mentioned. Int Scale Mode and Scanline Direction are the most important changes.
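Saved into a PlayStation-specific preset, the three changes above would look something like this (the parameter names are my best guesses at the Mega Bezel internals and may differ in your version; the bracketed menu labels are the authoritative reference):

```
// Illustrative .slangp parameter lines for the PlayStation tweaks.
// Names are assumptions - verify against a preset saved from the menu.
HSM_ASPECT_RATIO_ORIENTATION = "1.0"   // [Aspect Ratio] Orientation
HSM_INT_SCALE_MODE = "0.0"             // [CRT Screen Scaling] Int Scale Mode
HSM_SCANLINE_DIRECTION = "1.0"         // [Scanline Direction] Scanline Direction
```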


This image shows that the only problem with @Cyber's presets not working properly with @Duimon's graphics is the setting of CRT Screen Scaling, which I think @Duimon brought up with me in an earlier post. All the other settings are CyberLab Preset defaults.


@Cyber this is very impressive work; you even managed to deal with Sega's dithering quirks, which I thought was impossible. Very pretty.


Thank you very much @Birm. My current personal favorite preset in the pack is my Computer Monitor - Sharp preset.

You can play around with the MDAPT setting if you want that one to handle blending and transparency as well. I suggest setting MDAPT to 2 for most games that use dither patterns but 3 for games like Sonic The Hedgehog.
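Saved to a game preset, that per-game MDAPT choice is a single parameter line. A hypothetical sketch (the real parameter name for the MDAPT mode may differ, so save once from the menu and copy the line it writes):

```
// Illustrative: dither-handling mode saved per game.
// MDAPT_MODE is a placeholder name - check a menu-saved preset for the real one.
MDAPT_MODE = "2.0"   // most games that use dither patterns
// MDAPT_MODE = "3.0" // games like Sonic The Hedgehog
```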

When finished adjusting, you can save either a Game Preset or a Core Preset. I also recommend you get familiar with the Cropping settings to eliminate black borders or screen junk in some games. After you’re satisfied with your Cropping settings save a Game Preset.

Be sure to spread the word that these exist so that others may also benefit!


Where can someone find these awesome looking bezels?


This is the first day of the rest of your life!

Duimon - HSM Mega Bezel Graphics and Presets - Feedback and Updates

CyberLab Mega Bezel Death To Pixels Shader Preset Pack

HSM Mega Bezel Reflection Shader

SOQUEROEU - Mega Bezel TV Backgrounds

My Realistic Arcade Bezels

TheNameC - Mega Bezel COMMODORE Pack

Happy New Year!


Hi there!

I'm trying to use this pack with the Beetle PSX HW core and it seems to crash or softlock RetroArch on anything higher than 2x internal resolution. This is on a fresh install of RetroArch 1.9.14.

For what it's worth, I tried the other Beetle PSX core and got the same result. I can't get it to work at all with the DuckStation core, but it will work with the PCSX ReARMed core; however, I can't seem to find any internal resolution settings for that one.

I'm more interested in getting it to work with the Beetle PSX HW core for texture replacement; I just thought the extra bit of testing with the other cores might help though.


It’s likely that you are running out of GPU resources. This often happens when running at a higher than native resolution, e.g. 4x. Basically, the shader chain gets a large image coming from the core and resource usage goes through the roof.

If that is the case then you can try one of the following:

  1. Use downscaling in the core options to bring it back to native after the upscaling
  2. Try a STD preset instead of an ADV Preset, because the ADV presets scale up at 3x in the middle of the shader chain
  3. Try a DEREZ-480p preset which will downscale in the shader chain to 480p
  4. Run the core at native res :frowning:
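For options 2 and 3, if you've made custom presets the switch is just changing which base preset the `#reference` line points at (paths here are illustrative; match them to where your Mega Bezel copy actually lives):

```
// Illustrative: swap the referenced base preset in a custom .slangp.
// before - ADV preset, which scales up 3x mid-chain:
// #reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__1__ADV.slangp"
// after - STD preset, lighter on GPU resources:
#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__3__STD.slangp"
```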

Thanks @HyperspaceMadness! Would this be a GPU memory resources issue? I wonder if there's a way we can determine which GPUs or VRAM configurations are most likely to run into problems, and perhaps put some sort of warning in the documentation so users know what to expect?

We can probably try to collect resolution debugging info to see at exactly what output resolutions it’s happening and with which GPUs and eventually as our database builds we might start to see some patterns.

My Pre-Scale settings are set to a maximum of 1,600 for most of my presets to achieve the look I'm going for. This might also be a factor.

In these cases this would probably be the best solution for use with my presets as it allows the shader effects, post-processing, scanlines and masks to do all of the filtering and upscaling work. Especially with the settings I’ve chosen.

Things should look pretty good at native resolution but I guess new users especially are coming in with their existing configurations, solutions, preferences and expectations.

I don’t think we have to work that hard.

I think a global warning is enough.

My 3070 Ti crashes at 4X on Beetle PSX HW, using the ADV.

I can get to 4X on STD, but at 8X I get a "failed to allocate video memory" error just before it crashes.

My RTX has 8GB of memory.


Only 8GB on such a powerful graphics card, in this market, relative to the competition! Damn nVIDIA. You know, it might be interesting if someone were to set up a GPU modding service to replace those VRAM chips with double-density chips. There has already been similar work done on other GPUs in recent times. This might turn people's 3070s into some serious long-term powerhouses! I'm sure it can be reliably done with the right amount of research and practice.


Don’t damn Nvidia; damn the stooges that are buying up all the cards and driving up prices. I would have bought 12GB if it hadn’t cost $3000.

My point is still valid. If an RTX 3080 with 12GB of memory crashes using the STD at 16X or the ADV at 8X, it is only one parameter click away from crashing the top tier.

BTW. I can run every one of the DREZ presets at 16X.

Edit. Apparently the 3080 currently tops out at 10GB so…


The type of graphics card is still important today. The less memory it has and the slower it is, the less you can have on screen, of course. I have a 1650, and mine runs slow with pcsx2 on the Advanced shader.

Edit: I should specify that if I run it with a Standard shader, I can watch a movie at the same time and it will run 100% fine.

Yeah, this is really all dependent on what upscaling you have and what gets fed to the shader system. I think having a warning to try to educate users about upscaling, with suggestions about what to do like I mentioned, would be best. If you use the DREZ-480 for PSX or PS2, you will always be passing something close to native res through the shader chain.

This doesn’t actually change the resolution in the chain. Any control over resolution in the chain is defined in the preset. When you change the Core Resolution Sampling Multiplier you are just changing the sampling on the buffers which are already there.
