OLED and CRT Shaders

Hey there,

TL;DR: How much do OLEDs mess with CRT shaders?

Always been a fan of CRT & NTSC shaders and all that to help get older games looking a bit more like they used to.

Trying to figure something out. Got an LG C2 recently.

CRT Royale’s color reproduction changes noticeably based on whether geometry mode is on or off.

Upon reading, I’m realizing there’s a good chance this has to do with how the layout of pixels on an OLED differs from that of a typical LCD.

I’m wondering if that’s what’s going on here or if it’s just normal behavior and I’m overthinking things.

GTX 1070 (HDMI 2.0), HDR, RGB @ 8-bit + dithering (it happens with 12-bit as well; I check/test that by lowering the refresh rate to 30 Hz)

No Geometry:

W/ Geometry (More vivid):

I can reproduce what appears to be the exact same behavior if I change the mask sample mode to 1, which I believe is technically lower quality.


Yeah, I’m not super surprised by this; the 3D geometry projection does additional antialiasing.

Almost all (maybe all) of the other shaders apply the mask differently, and therefore don’t do antialiasing after the phosphor mask is applied.
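As a rough illustration of why mask application and subpixel layout interact, here’s a minimal sketch (not any particular shader’s actual code) of a simple RGB aperture-grille mask applied per output pixel, assuming a display whose subpixels run in plain R,G,B order left to right:

```python
# Minimal sketch of an aperture-grille-style phosphor mask: a repeating
# 3-pixel-wide pattern that emphasizes one channel per output column.
MASK = [(1.0, 0.2, 0.2),   # column 0 emphasizes red
        (0.2, 1.0, 0.2),   # column 1 emphasizes green
        (0.2, 0.2, 1.0)]   # column 2 emphasizes blue

def apply_mask(row):
    """Multiply each (r, g, b) pixel of a scanline by the repeating mask."""
    out = []
    for x, (r, g, b) in enumerate(row):
        mr, mg, mb = MASK[x % 3]
        out.append((r * mr, g * mg, b * mb))
    return out

# On a WOLED like the C2 the physical subpixel order differs (WRGB rather
# than an RGB stripe), so mask columns no longer line up with the subpixels
# they assume, which is one reason masks tuned for LCDs can shift color.
```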


I’ll be trying those Megatron shaders again. Dove pretty deep into the thread for it… For some reason (and I saw someone experiencing similar behavior toward the current tail of the thread), I’m getting significant color issues when attempting to use them. Everything appears sepia-toned 🥲.


Hmm, fair. I mean, it looks fiiiine, just trying to get a better understanding of things since… OLEDs apparently complicate things. Urggg I was so excited to finally have 4K, and OLED sounded like a dream for emulation with the colors and blacks. But muh CRT shaders.

This usually indicates HDR is on/working. I would suggest looking through the Megatron thread to see what issues may have been solved and how.

Usually any color issues with the Megatron are not related to the OLED, but more to the HDR pipeline working or to connection settings like 4:4:4 color.


I wasn’t really recommending the Megatron shader (even though it’s awesome in its own right) just the writings and learnings about OLED Subpixels and Mask interactions.

You can also take the easy route and try my preset pack which has been designed on and caters for LG OLED displays first and foremost.


Gotcha, yeah my bad, I did pick that up but kinda went off on my own there a bit haha. I appreciate ya :grin:, definitely got a better understanding there. Also have your shader presets, so I’ll certainly be giving them a shot.


Cool, thanks for the info and I’ll do some more reading & troubleshooting :+1:


Turned out to be… Two things.

The C2’s input needed to be in PC mode to avoid behind-the-scenes chroma subsampling. Forgot about the need to do that…

Secondly, for whatever reason Megatron HDR doesn’t seem to like Vulkan on my rig; good color doesn’t seem to return unless I’m in D3D11.

Royale no longer shows (to me) noticeable color vibrancy degradation sans geometry after getting true 4:4:4. I didn’t seem to need D3D11 to replicate this. Anyway, I’m content for now :upside_down_face:

Thanks again for leading me in the right direction.


C2 HDMI Input needs to be in PC Mode, Picture Mode needs to be Game or HDR Game.

HDMI ULTRA DEEP COLOR should be enabled on your HDMI input.

Colour Format should be RGB 4:4:4 Full.

Aspect Ratio should be Original.

Sony Megatron Colour Video Monitor can be tricky, why not post a question in the thread? HDR needs to be enabled in RetroArch for Megatron to work properly in HDR mode. There are HDR and SDR presets in Megatron as well.

You also need to set the white point and paper white settings correctly. Also try different presets.
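For anyone checking these, the RetroArch-side HDR values live in retroarch.cfg. Here’s a sketch of the relevant entries (key names as they appear in recent RetroArch builds; the numbers are placeholders to tune per display, not recommendations):

```ini
# retroarch.cfg — HDR-related entries (example values only)
video_hdr_enable = "true"
video_hdr_max_nits = "800.000000"          # your display's peak brightness
video_hdr_paper_white_nits = "200.000000"  # the "paper white" level mentioned above
video_hdr_expand_gamut = "true"
```

The same values are adjustable in-app under Settings > Video > HDR.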


Mm, should be all set on those settings (though I’ll double check). I think we’re good. I am sad to find out my GTX 1070 doesn’t quuuite handle 4k mega bezel with reflections when integer scaled up a bit. Sad, but not surprised (old 2k card that honestly was never that amazing with 2k lol). Sometimes you can get away with a lot on the GPU performance side of emulation. Just gotta turn the reflections off :upside_down_face:

I am thinking about posting some pictures in the Megatron thread to see if I’m at least in the ballpark of how things should look. I did mess with several shader parameters, including the Paperwhite. Just not quite sure if I’m there yet with it.


I am the creator of the CyberLab MegaBezel Death To Pixels Shader Preset Pack, and I use a GeForce GTX 1070 at 4K exclusively. Mind you, my GeForce GTX 1070 is a mini-ITX card with one fan and not the highest clock speed in the world; the highest it reaches is probably 2,000 MHz. I play everything from older console/arcade games up to Dreamcast, PS1, N64 and Wii at 4K flawlessly using my 4K_Optimized presets.

What CPU are you using? RAM speed? You might need to set up monitoring software, for example MSI Afterburner and RTSS, in order to properly identify where the bottleneck in your system is.

You can take a look at these videos to get an idea of how things run on my Geforce GTX 1070.

Example videos:

This is what CyberLab Turbo Duo for Blargg + Blargg_NTSC_Turbo_Duo_SNES_PSX_S-Video_CyberLab_Special_Edition looks like!

CyberLab Turbo Duo for Blargg + Blargg_NTSC_Turbo_Duo_SNES_PSX_S-Video_CyberLab_Special_Edition

This is what CyberLab SNES looks like!

CyberLab SNES

This is what CyberLab Genesis for Blargg + Blargg_NTSC_Genesis_S-Video_CyberLab_Special_Edition looks like!

CyberLab Genesis for Blargg + Blargg_NTSC_Genesis_S-Video_CyberLab_Special_Edition

This is what CyberLab NES for Blargg + Core Blargg NTSC S-Video looks like!

CyberLab NES for Blargg + Core Blargg NTSC S-Video

You can use MPC-HC, MX Player or VLC Player to view the videos.


Hmm interesting.

Don’t believe it’s my other hardware.

GPU: EVGA GTX 1070 SC (SC+, iirc? Long time ago :stuck_out_tongue: ). Hits just under 2,000 MHz core clock usually.

CPU: AMD 7700X

RAM: 6,000 MHz DDR5 (forget the timings ATM but they’re at least decentish)

I suspect it’s definitely my GPU struggling here, outside of some complexity interacting with it indirectly; it was hitting 97–100% core load, which I confirmed with HWiNFO.

It only happens when I increase the integer scale parameter in the shader parameters; it’s a full 60 until I do that. I’d need to do some more testing, but it was happening on several presets from your pack (which have all looked wonderful so far, btw). Console-specific ones; I was using the composite SNES NTSC II one last when I replicated the performance drops. Choosing the exact same preset sans reflections brought me down to roughly 70% GPU usage with integer scaling up 3x.

I’ll definitely check out the videos.


I am trying to maximize the game’s vertical fill, FYI. The bezel gets cut off a bit.

There’s something cool about dimming the frame to near black and turning the opacity of the background to 0. It looks really vibey with the reflections (to me heh), even with the bottom and top of the bezel mostly missing.

Primarily, I want the content itself looking nice. Anything with the bezel is highly secondary. I could honestly live without it if there’s an optimal way to get just the picture. Which if I’ve gathered correctly is turning the frame off and background opacity to 0.

My understanding is that as long as I stick to multiples with integer scaling I’m good. Perhaps the reflections just get more taxing as things get larger?


Oh my goodness… I completely forgot about that as well. D:

I’ve always wondered why Royale gave me weird tints when changing the number of triads. Maybe this is the reason. I gotta enable PC mode before I forget.


Beginning of my struggles there haha.

I’ve been switching between the different compromises of HDR with HDMI 2.0. So many differing opinions.

For the past month or so I settled on YCbCr 4:2:2 10/12-bit, based on the fact that this is, to my understanding, what last-gen consoles output.

This is the first time I found pretty damning results, messing about with CRT shaders. 4:2:2 was what was initially causing me to get bad magenta tinting (and other colors depending on triad size).

Switching to YCbCr 4:4:4 8-bit w/ dithering or RGB Full 8-bit with dithering fixed the tinting problem big time. No more terrible tinting… but still not quite right. The catalyst for this post heh.

PC mode did it there.
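The compromises described above come down to HDMI 2.0 bandwidth. A back-of-the-envelope check (assuming the standard CTA-861 594 MHz pixel clock for 4K60 and HDMI 2.0’s roughly 14.4 Gbps of effective data rate after 8b/10b encoding) shows why 8-bit 4:4:4 fits but 10-bit 4:4:4 doesn’t:

```python
# Rough HDMI 2.0 bandwidth check for 4K60.
# HDMI 2.0 carries 18 Gbps raw; 8b/10b encoding leaves ~14.4 Gbps for data.
PIXEL_CLOCK_HZ = 594e6   # standard CTA-861 timing for 3840x2160 @ 60 Hz
EFFECTIVE_BPS = 14.4e9

def fits(bits_per_pixel):
    """True if a format at this many bits per pixel fits 4K60 over HDMI 2.0."""
    return PIXEL_CLOCK_HZ * bits_per_pixel <= EFFECTIVE_BPS

print(fits(24))  # RGB / 4:4:4 at 8-bit = 24 bpp: fits (just barely)
print(fits(30))  # RGB / 4:4:4 at 10-bit = 30 bpp: does not fit
print(fits(24))  # 4:2:2 at 12-bit averages 24 bpp: fits, hence the compromise
```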


Okay, if you list the exact presets (full names) and settings you try that are running too slowly, I should be able to test on my end and see if I can reproduce the issue.

Also, what core and core settings are you talking about?

Why are you increasing the Integer Scale %? Why not just increase the offset instead? Not sure why you’d need to do that with the newer Neo-GX presets either, as they’re pretty much at the max that will allow for some bezel all around. You can use the Crop % settings to get rid of black bars in certain games.

Also, not sure if you also enabled Curvature or 3D Curvature but that can hit performance kinda hard.

The clearer and more specifically you explain, the easier it will be for me to get to the bottom of this.

It sounds like something that can be resolved.

If you don’t need the Frame, Bezel or Reflections you can use my presets in the MBZ__2__PERFORMANCE_NO_REFLECTIONS Folder for a significant performance boost.

You can even use the MBZ__1… or MBZ__3… folders and you won’t be missing out on much image quality wise but you can get additional performance over the MBZ_0… presets.

As a matter of fact you really should try my Neo-GX presets, which are my latest, as they are all in the MBZ__1… performance tier (even the ones which are included in the MBZ__0… folder).

I use RTSS Frame limiter at 60fps and I’ve also overclocked my GPU using MSI Afterburner Automatic Curve Overclocking with a memory overclock of about +200MHz.


This may all be normal and/or just a product of me messing with things that don’t need messing about with, but here are my findings for the sake of science.

Current personal preference on the integer scaling: I want the picture vertically maxed even if a small amount is lost, with whatever integer scale multiple gets it there. 4K helps get it basically right there compared to 1080p. I don’t mind losing the bezel top and bottom much. Still think the left and right reflections look neato. Definitely not a standard user there :stuck_out_tongue_winking_eye:

Definitely point out where I may have lost the plot or need for further clarity.

Note: More testing (not at 2:00 AM this time) shows that I actually get not-so-great performance in the below shaders, even without touching integer scaling, until I raise the Vulkan swapchain images from 2 -> 3, giving the GPU some more breathing room. Perhaps that’s where our configs differ here?

Core: BSNES (non-2014)

Core settings: Stock. Haven’t touched them.

Driver: Vulkan

Vulkan image Swapchain: 2

Integer Scale: Adjusted via multiple offset shader param, disabled in RetroArch

Aspect Ratio: Full

Retroarch: Tried current nightly and Stable (1.15.0)

Curvature: Whatever it is in those presets; didn’t manually enable it for testing.

All in MBZ_0

Shader: CyberLab_SNES_Composite_Shadow_Mask_Smooth_IV_OLED_II

Sub-60 fps (audio crackle as the indication). Integer scaling doesn’t make it worse as far as I can tell. Also, integer scaling doesn’t change the scaling of the bezel here… so the reflections aren’t changing (in case the performance loss is due to increased sampling or something in the following shader?).

Shader: CyberLab_SNES_Composite_Slot_Mask_IV_OLED_NTSC_II

Performance is fine at stock settings but degrades as the integer scale offset increases. Swapchain images at 3 doesn’t quite get it above 60 consistently, though it is improved. Notably, compared to the others, I have to enable integer scaling for this shader, and it affects the bezel scale. So to me, that sounds like I’m potentially coasting further into “stop mucking about with things, you don’t want to do that” territory. But 🤷

Shader: CyberLab_SNES_Composite_Shadow_Mask_Neo-GX_Ultra

Performance is sub-60 with stock settings. Integer scaling has no effect on performance. It doesn’t affect bezel scale either.

D3D11 also performs fine with all shaders except the slot mask one there when I manually enable integer scaling in the shader params and go to 2x multiple offset.
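For reference, the swapchain setting being toggled between 2 and 3 here is a single retroarch.cfg entry (key name as in recent RetroArch builds):

```ini
# retroarch.cfg — 3 swapchain images gives the GPU an extra frame of
# buffering headroom, at the cost of a little added input latency
video_max_swapchain_images = "3"
```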


Stock settings including Vulkan Image Swapchain?

What about things like Preemptive Frames and Frame Delay or BSNES’s internal Run Ahead?

Are you using the latest Mega Bezel Reflection Shader v1.14 from the Mega Bezel GitHub?

You can try backing up your RetroArch.cfg file, then deleting it and starting over your RetroArch configuration following the Mega Bezel setup instructions to a tee.

When I get a chance I’ll see if I can run the same settings that you’re trying to accomplish and give you some performance numbers or maybe even a video clip.

Any particular game you’re trying to run that’s not performing well?

Other components in my system include an AMD Ryzen 5 5600X, 32GB DDR4 3200MHz, and I also have a second GPU running in SLI with the first one. This wouldn’t help performance in RetroArch and could actually hurt it, because there’s less airflow to my primary GPU and my PCI-E x16 slots are running at x8 instead of x16.

Here is some performance info from my system, as well as the parameters you’d need to change to get the image to fill to the top and bottom of the screen using my CyberLab_SNES_S-Video_Shadow_Mask_Smooth_Neo-GX_Ultra shader preset.

Sometimes older graphics cards can develop cooling issues as their thermal paste dries up. Also case airflow could be a factor, so it’s something to also check out. These cards start throttling at around 80°C by default.