Sony Megatron Colour Video Monitor

Hmm, if you’re seeing the menu go brighter when you press F1 during emulation (i.e. the game you’re playing), then I’d say the problem isn’t Windows 11 (although you might want to try going back to the Nvidia 471 drivers, as there are also HDR issues with later drivers).

I’d go with the idea that either you don’t have the correct version of RetroArch (no current release supports this shader yet, so you’d need a nightly build, build it yourself, or wait until 1.9.15), or you need to change your peak and paper white luminances (shader params) to better match your monitor.

Oh, sorry I didn’t see that part. I’m on RetroArch 1.9.14

Ah yes sorry for the confusion - it’s good we’ve got to the bottom of the problem though!

If you’re willing to be a guinea pig (and run a possibly unstable build), you can get the most recent nightly build here:


So I managed to get your shader to work with this nightly. I definitely see that HDR is working; however, it also works for pretty much every other shader, so I’m not really sure what you’re trying to accomplish, if you don’t mind my asking. Here’s a picture of me running guest advanced using dx11. Looks great! Sadly, HSM never loaded after waiting a reasonable time. :pleading_face:

I should also add that dx11 seems a lot more stable now! :+1:

With HDR, 100% Mask is usable.


The idea is that it’s very basic code that runs very fast in a single pass, but looks real/natural just because of the HDR, instead of needing 10+ passes and a super-powered GPU to simulate bloom, etc.
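To put rough numbers on why the HDR headroom matters (these are my own illustrative figures, not values taken from the shader): a mask that lights only one in three subpixel columns, combined with scanlines that leave half the lines dark, means the panel has to be driven several times brighter than the target image just to break even. A quick Python sketch:

```python
def required_peak_nits(target_nits, mask_duty, scanline_duty):
    """Peak luminance needed so the masked/scanlined image still
    averages out to the target brightness (idealised model)."""
    return target_nits / (mask_duty * scanline_duty)

# Aperture grille lighting 1 of 3 subpixel columns, 50% scanline coverage:
peak = required_peak_nits(200, 1/3, 1/2)
print(round(peak))  # 1200 nits, far beyond a typical ~300-nit SDR panel
```

SDR shaders can’t get that headroom, which is why they fake perceived brightness with blur/bloom instead.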


So maybe on a superficial level it looks similar (aperture grille mask and scanlines) but I can assure you it’s much more accurately mimicking a CRT monitor.

What all the SDR shaders are mainly doing to try and mimic a real CRT is blurring the image in various ways to produce a faux brightness: lighting up more pixels, or making nearby pixels take on the colours of their neighbours.

This is not what an actual CRT does: it simply excites/lights up RGB-coloured phosphors (much like your monitor natively does), and most of the lighting effects come from the fact that it’s very bright.

The bottom line is that you need an HDR display to accurately match an actual CRT, at least the professional one that I own.

When you see my PVM play the intro to Street Fighter Alpha 3, you can see distinct flashes of brightness that you simply won’t get with an SDR monitor.

You could try playing around with the Contrast, Paper White Luminance and Peak Luminance levels in the shader, particularly Paper White Luminance.

One thing I’d also say is that maybe your monitor isn’t bright enough (from memory you’re using a C9 OLED). OLEDs aren’t the brightest, and there’s a difference between whole-screen brightness and local brightness; these types of shaders may also be playing havoc with the OLED’s burn-in prevention algorithms, etc. However, I don’t know, not owning an OLED myself.

Another thing I’d say is that Mario possibly isn’t the best test case, as it’s only got 12 colours on screen at once. Maybe try something a bit more colourful.


Just one more thing: I changed the default contrast last night to much more accurately match the colours of my PVM. You might want to try that as well, but it’s probably not going to have much effect on the brightness levels.


For a PVM look I would do the same. There’s no need for complex filters to fake an early-’80s CRT: just a cheap sharpening filter, some simple scanline code and a few extras like white point adjustment, etc. In fact I did something similar in the other thread, while trying to keep some luminance for normal monitors.


I’m not sure about the ‘early ’80s’ bit - PVMs throughout the years all use the same basic aperture grille design, so a 2000s PVM/BVM will be doing the same thing, just with higher resolution and brightness.

Lower-end CRTs that use shadow masks and slot masks still do much the same thing, although given current resolution limitations (4K), you could maybe argue there’s more of a need for complicated setups when mimicking their masks. I’d argue you need more resolution and fewer blurs.
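As a rough sketch of that resolution argument (illustrative numbers, nothing measured): an RGB mask needs at least one output pixel per phosphor column, so horizontal resolution hard-limits how many triads you can draw.

```python
def max_triads(screen_width_px, px_per_triad=3):
    """Maximum whole phosphor triads a fixed-pixel display can show,
    assuming each triad needs at least px_per_triad pixels (one per R, G, B)."""
    return screen_width_px // px_per_triad

print(max_triads(3840))   # 1280 triads across a 4K screen
print(max_triads(1920))   # 640 at 1080p, hence much coarser mask emulation
```

Shadow and slot masks need even more pixels per triad (for the stagger and gaps), which is why they’re harder to do convincingly at 4K than an aperture grille.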

I do agree though that my shader could do with more controls over colour balance, white points, etc.

From what I’ve seen, most shaders don’t really mimic what an actual CRT does, with all sorts of blurs and noise generators applied before and after the scanlines and masks. Don’t get me wrong, they’re very impressive, but I’d argue we were forced into all that because LCD display technology wasn’t quite up to the task.


I mean that super-blurry shaders are not what normal RGB CRTs look like (arcade monitors were all RGB). I bet this comes from NTSC territories, where as far as I know the CRTs didn’t have RGB SCART. Even so, my Trinitron CRT looks crisp on composite, and of course crystal clear on RGB. I’d only need minimal blur in a shader to match that.


Most of the time, the highest we could push a console in the US without modding something, either the console or the TV, was S-Video. (And in most cases people were probably just running their consoles via RF, either directly to the TV or through a VCR, in the US. Pre-PS1, anyway.)

But yeah, a lot of the CRT shaders and presets are overly blurry. My presets as well :joy:


Do you mean all arcades used slot or shadow masks? I’m not sure NTSC has much to do with the blurriness; it would certainly affect how far apart the scanlines were. Although when I went into my local arcade a month ago, I noticed the scanlines were so close they merged into one another, with little/no blank space (dulling) visible between them. That’s something that could be mimicked with the above shader by allowing scanline widths greater than 1.

Certainly the signal going in via composite, RF, S-Video and other cable/signal types could produce noise - it’d be really easy to add that to this shader as a first stage, before the inverse tonemapper and HDR10 conversion. Do we have ‘composite/RF/S-Video signal’ noise shaders?
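As a sketch of what such a first-stage noise pass might look like (the sigma value is made up, and real composite noise isn’t plain Gaussian), here’s a minimal NumPy version:

```python
import numpy as np

def add_signal_noise(frame, sigma=0.02, seed=None):
    """Add Gaussian 'analogue signal' noise to a [0,1] RGB frame,
    clipping the result back into the displayable range."""
    rng = np.random.default_rng(seed)
    noisy = frame + rng.normal(0.0, sigma, frame.shape)
    return np.clip(noisy, 0.0, 1.0)

frame = np.full((240, 320, 3), 0.5)          # flat mid-grey test frame
noisy = add_signal_noise(frame, sigma=0.02, seed=0)
print(noisy.min() >= 0.0, noisy.max() <= 1.0)
```

Running this before the inverse tonemapper would keep the noise in the SDR signal domain, where it belongs, rather than in the expanded HDR output.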

Yeah, I had an Atari 520 STE at some point (the Amiga was always on RGB) with RF, and that looked much, much worse than blurry shaders lol. Absolute crap. I couldn’t stand playing for more than 5 minutes without my eyes bleeding :joy: I’d have paid for it to look like crt-geom lol


You would’ve undoubtedly been using PAL with your Amiga too; at least, that was the best way to view Amiga games, because all (OK, most) of the developers were from PAL territories. NTSC on the Amiga was just as bad as PAL on the consoles (possibly worse).


Yeah, check out ntsc-adaptive in the ‘ntsc’ directory.


Oh yes, PAL. Fantastic machine. Coming from 8-bits I was blown away by its video and sound capabilities. I still have it driving my Trinitron :wink: (the 1084-SP died long ago).

That’s interesting; so what exactly does this shader do in terms of noise generation?

Are you sure this shader mimics an RF or composite signal? I can tell you from first-hand experience that PAL was terrible through RF as well! :joy:

It does NTSC modulation and demodulation, chroma/luma crosstalk and fringing.
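For anyone wondering what that crosstalk looks like concretely, here’s a toy one-scanline model in NumPy (my own simplification, not the actual ntsc-adaptive code): luma and quadrature-modulated chroma share a single composite signal, and a sharp luma edge leaks into the demodulated chroma as fringing.

```python
import numpy as np

n = 256
t = np.arange(n)
fsc = 0.25                             # subcarrier: 4 samples per cycle
Y = np.where(t < n // 2, 0.2, 0.8)     # luma with a sharp step
I = np.full(n, 0.3)                    # flat chroma (I)
Q = np.full(n, 0.1)                    # flat chroma (Q)

# Composite signal: luma plus chroma QAM-modulated onto the subcarrier
carrier = 2 * np.pi * fsc * t
composite = Y + I * np.cos(carrier) + Q * np.sin(carrier)

# Demodulate with a one-subcarrier-period moving average (crude lowpass)
lp = np.ones(4) / 4
Y_rec = np.convolve(composite, lp, mode="same")
I_rec = np.convolve(2 * composite * np.cos(carrier), lp, mode="same")

# Away from the edge, both channels come back cleanly...
print(round(Y_rec[60], 3), round(I_rec[60], 3))
# ...but near the luma step, luma leaks into chroma: that's the fringing
print(np.max(np.abs(I_rec[120:136] - 0.3)) > 0.05)
```

A real implementation also band-limits chroma far more aggressively than luma, which is where the characteristic colour smearing comes from.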

There’s also a really good PAL one in the ‘pal’ subdirectory from r57shell.


One thing I should mention is that for HDR presets, shaders need something like what the guest shaders have (DCI-P3 colour gamut support), because otherwise the image will be oversaturated.

This is intriguing. I definitely don’t know enough about the reasoning behind certain passes for shader presets to fake the CRT look.

Ah OK, that’s interesting. The back end of a monitor is an area I haven’t thought much about, tbh, but we should be able to reuse all the stuff that’s already been written, as the inverse tonemapper and HDR10 encoding map roughly onto the electron gun/phosphor section of the TV.
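The HDR10 half of that mapping is essentially the SMPTE ST 2084 (PQ) transfer function, which turns absolute luminance in nits into a 0-1 signal. A minimal Python version of the standard formula:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) encoding: absolute luminance in nits -> [0,1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0        # PQ is defined up to 10,000 nits
    ym = y ** m1
    return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

print(round(pq_encode(100), 3))    # SDR reference white: ~0.508
print(round(pq_encode(1000), 3))   # ~0.752
print(pq_encode(10000))            # 1.0
```

This is where the paper white and peak luminance shader params would plug in: paper white decides which nit level SDR white maps to before PQ encoding, and peak luminance caps the top end.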

I’m still a little confused though, as I had PAL TVs back then with RF, composite and S-Video connections, which all produced different results. It doesn’t sound like an NTSC or PAL shader would cover that part? Am I wrong in thinking this?