CyberLab Death To Pixels Shader Preset Packs

@Cyber this is very impressive work; you even managed to deal with Sega's dithering quirks, which I thought impossible. Very pretty.


Thank you very much @Birm. My current personal favorite preset in the pack is my Computer Monitor - Sharp preset.

You can play around with the MDAPT setting if you want that one to handle blending and transparency as well. I suggest setting MDAPT to 2 for most games that use dither patterns but 3 for games like Sonic The Hedgehog.

When finished adjusting, you can save either a Game Preset or a Core Preset. I also recommend you get familiar with the Cropping settings to eliminate black borders or screen junk in some games. After you're satisfied with your Cropping settings, save a Game Preset.

Be sure to spread the word that these exist so that others may also benefit!


Where can someone find these awesome looking bezels?


This is the first day of the rest of your life!

Duimon - HSM Mega Bezel Graphics and Presets - Feedback and Updates

CyberLab Mega Bezel Death To Pixels Shader Preset Pack

HSM Mega Bezel Reflection Shader

SOQUEROEU - Mega Bezel TV Backgrounds

My Realistic Arcade Bezels

TheNameC - Mega Bezel COMMODORE Pack

Happy New Year!


Hi there!

I'm trying to use this pack with the Beetle PSX HW core and it seems to crash or softlock RetroArch at anything higher than 2x internal resolution. This is on a fresh install of RetroArch 1.9.14.

For what it's worth, I tried the other Beetle PSX core and got the same result. I can't get it to work at all with the DuckStation core, but it does work with the PCSX ReARMed core; however, I can't seem to find any internal resolution settings for that one.

I'm more interested in getting it to work with the Beetle PSX HW core for texture replacement; I just thought the extra bit of testing with the other cores might help.


It’s likely that you are running out of GPU resources. This often happens when running at higher than native resolution, e.g. 4x. Basically the shader chain gets a large image coming from the core and resource usage goes through the roof.

If that is the case then you can try one of the following:

  1. Use downscaling in the core options to bring it back to native after the upscaling
  2. Try a STD preset instead of an ADV preset, because the ADV presets scale up to 3x in the middle of the shader chain
  3. Try a DEREZ-480p preset which will downscale in the shader chain to 480p
  4. Run the core at native res :frowning:
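As a rough sketch of what options 2 and 3 mean at the file level: a game or core preset points at a base preset with a `#reference` line, so switching tiers is just pointing that line at a lighter base file. The path below is the one mentioned later in this thread; treat it as illustrative, since folder layouts vary between Mega Bezel releases.

```ini
# Illustrative .slangp game preset. The path is an example only and
# may differ in your Mega Bezel install.

# ADV base preset (heavy; upscales mid-chain):
#reference "shaders_slang/bezel/Mega_Bezel/Presets/Base_CRT_Presets/MBZ_1_ADV_GDV.slangp"
```

To drop down to a STD or DREZ tier, you would change this single line to point at the corresponding STD or DREZ base preset file instead.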

Thanks @HyperspaceMadness! Would this be a GPU memory resources issue? I wonder if there’s a way we can determine which GPUs or VRAM configurations would most likely run into problems, and perhaps put some sort of warning in the documentation so users know what to expect?

We can probably try to collect resolution debugging info to see at exactly what output resolutions it’s happening and with which GPUs and eventually as our database builds we might start to see some patterns.

My Pre-Scale settings are set to a maximum of 1,600 for most of my presets to achieve the look I’m going for. This might also be a factor.

In these cases this would probably be the best solution for use with my presets, as it allows the shader effects, post-processing, scanlines and masks to do all of the filtering and upscaling work, especially with the settings I’ve chosen.

Things should look pretty good at native resolution but I guess new users especially are coming in with their existing configurations, solutions, preferences and expectations.

I don’t think we have to work that hard.

I think a global warning is enough.

My 3070 Ti crashes at 4X on Beetle PSX HW using the ADV preset.

I can get to 4X on STD, but at 8X I get a “failed to allocate video memory” error just before it crashes.

My RTX has 8GB of memory.


Only 8GB on such a powerful graphics card, in this market, relative to the competition! Damn nVIDIA. You know, it might be interesting if someone were to set up a GPU modding service to replace those VRAM chips with double-density chips. Similar work has already been done on other GPUs in recent times. This might turn people’s 3070s into some serious long-term powerhouses! I’m sure it can be done reliably with the right amount of research and practice.


Don’t damn nVIDIA, damn the stooges that are buying up all the cards and driving up prices. I would have bought 12GB if it hadn’t cost $3000.

My point is still valid. If an RTX 3080 with 12GB of memory crashes using the STD at 16X or the ADV at 8X, it is only one parameter click away from crashing at the top tier.

BTW, I can run every one of the DREZ presets at 16X.

Edit: Apparently the 3080 currently tops out at 10GB, so…


The type of graphics card is still important today. The less memory it has and the slower it is, the less you can have on screen, of course. I have a 1650 and mine runs slowly with PCSX2 on the Advanced shader.

Edit: I should specify that if I run it with a Standard shader, I can watch a movie at the same time and it will run 100% fine.

Yeah, this is really all dependent on what upscaling you have and what gets fed to the shader system. I think having a warning to educate users about upscaling, with suggestions about what to do like I mentioned, would be best. If you use the DREZ-480 for PSX or PS2, you will always be passing something close to native res through the shader chain.

This doesn’t actually change the resolution in the chain. Any control over resolution in the chain is defined in the preset. When you change the Core Resolution Sampling Multiplier you are just changing the sampling on the buffers which are already there.


My point was that the competition is offering a faster SKU with double the VRAM in this current market, and given what the GPU is capable of, a mere 8GB of VRAM puts some artificial limits on its potential. With a full 16GB of VRAM, or even 12 or 14GB, it would have been a much easier choice, inflated prices aside.

When I assess a product’s relative value, it’s not just about whether I can afford it. I want something that is properly balanced. This card might be the epitome of planned obsolescence, and nVIDIA is always looking for ways to hold back and avoid another 9 and 10 series, when we got so much value across the board in addition to significant generational performance increases.

Does this work with the Xbox Series X / UWP version? I attempted to install and apply it but it did not work for me. Wondering if anyone has been successful or if it’s just not possible. Thanks! Looks great.


I tried it on Xbox Series X and it didn’t work. It tried for about 5 minutes and then gave up.


Is RetroArch finally in the Xbox store? Just curious, as Microsoft is now timing out dev mode if you aren’t using it to do dev things.

Greetings @futurepr0n! Welcome to Libretro Forums.

I haven’t read of anyone getting these shaders to work on the Xbox Series X / UWP version. I don’t like to assume though, so I’ll leave it at that. The HSM Mega Bezel Reflection Shader has some known issues with loading under DirectX which don’t exist on Vulkan / GLCore. Microsoft Xbox consoles use DirectX and don’t support Vulkan / GLCore as far as I know.

I have a strong feeling and hope that these issues will be looked at and addressed at some point in the future, which will further widen the base of compatibility and accessibility.

You’re welcome, I’m glad you like my shader presets!

Holy smokes. What kind of magic is this? I’m not lying when I say that applying your shader for the first time kind of blew me away. I was already impressed with the stock HSM shaders, but the CRT shaders that come with it, although they are nice, didn’t really gel with me as much as my slightly customized dr-venom preset.

The reflection bezels were incredible, so I was kind of torn. Then I saw somebody recommending your shader presets and, without even realizing what they do, I just installed them along with a couple of other bezel packs without reading much of the introductory post. Once I fired them up I was like “damn, am I crazy or why is the image so crisp?”. This is like experiencing retro games in a whole new way WITHOUT losing any of their retro charm. Beautiful work!

I think the overall experience of getting some of these things to work is still quite confusing, so I’m considering writing a Reddit post with a more condensed tutorial on how to get the HSM Mega Bezels to work along with your shaders and the Duimon bezel community pack, once I get my head around a few things. I’m still confused about some aspects, and I’m no stranger to modding games and tinkering with my stuff; if I’m confused about certain steps, many other people will be as well, and they won’t get to experience these glorious shaders.


Sorry, I hope it’s okay to break this up into two posts, but I have a few questions as well:

  1. Which parameter do I need to adjust to turn up the brightness a little?

  2. With Tekken 3 I encountered a strange issue: after the boot logo, when the first video sequence starts playing, the aspect ratio suddenly switched to an almost top-down, Nintendo DS-like format, as if 16:9 were turned by 90°, squashing the image horizontally. I went into the shader parameters and adjusted the aspect ratio manually and it worked, but unfortunately the image got all fuzzy, as if it were missing half of the pixels on the horizontal axis. Any clue what the problem is here? I haven’t tried every single game in my library, but so far every other game works perfectly.

  3. I’m still very confused as to how I can use your shaders alongside the Duimon bezels. I looked up a few posts in here and tried following the instructions, but they’re too vague; there seems to be something I’m missing. Is there a detailed tutorial anywhere on how to do this?


There are several parameters that can be used. There’s no free lunch though. Once you go too far with one parameter, it can mess up the image in other ways.

There are the Post CRT Brightness, Grade and GDV Brightness settings. Increasing Halation also increases brightness, as can adjusting the scanline shape, deconvergence and Scanline Type. Mask Strength matters as well.

Which of my presets are you trying to make brighter?

That Tekken 3 problem is a known issue. I’ll let @HyperspaceMadness know. It uses a strange resolution for its aspect ratio and that throws off the detection in HSM Mega Bezel Reflection Shader. I’m sure there’s a workaround available here or in the Mega Bezel Thread.

This stuff, like many other things in life, has a learning curve. The easiest way to begin to understand it is to open up your favourite preset and look at the first line, which says reference.

There should be a filename and/or path.

Open the file being referenced and look at the filename being referenced in its first line. Repeat this until you reach a filename in the Mega Bezel Base CRT Presets folder called MBZ_1_ADV_GDV.slangp.

This is your base CRT Shader Preset.

Since @Duimon’s presets and mine both use the same base preset family, you can replace the filename and path of the base preset in my shader preset with the name of whichever of his presets you would like to use.

Other than that there might be a couple other parameters to change so that everything matches up and behaves properly. @hgoda90 mentioned them in one of his posts above.

So basically what you would have is my shader preset passing through @Duimon’s, picking up and applying his bezels and settings on the way to the final base CRT preset, instead of loading the base preset at the point where it was in my shader preset chain.
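Put concretely, the swap amounts to editing the reference line at the top of the game preset. The Duimon path below is purely illustrative; substitute the actual path of whichever of his presets you want to use.

```ini
# CyberLab game preset, edited so the chain passes through Duimon's
# preset before reaching the base CRT preset (MBZ_1_ADV_GDV.slangp),
# which Duimon's preset itself references further down the chain.
# The path below is a made-up example; use your real install path.
#reference "shaders_slang/Duimon-Mega-Bezel/Presets/Example_System.slangp"
```

The line it replaces would have pointed directly at a file in the Mega Bezel Base CRT Presets folder, so the rest of the chain stays exactly as before.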