Sony Megatron Colour Video Monitor

Are Retroarch’s menus displaying in HDR?

And is the resolution set to 2160p? (Just checking since that screenshot is 1080p)

Yes, and yes. Man, that menu text is blinding. :dizzy_face: The screenshot I took was too big for the forum, so I had to shrink it.

So despite being called “default”, that preset is kind of wonky. I have a testing fork that includes more refined presets built specifically for LG C1s and other 800 nit LG panels (most recent version here). Make sure you have HGiG enabled in your display’s settings.

If you would rather tweak settings yourself: go into Shader Parameters, set HDR: Display’s Peak Luminance to 800, and HDR: Display’s Paper White Luminance to 650-670ish. Again, make sure you have HGiG enabled in your display’s settings.

You should also set “Display’s Subpixel Layout” to match your panel. (That would be RWBG if you have an LG panel other than the G5, which uses a new BWRG layout that isn’t supported in Megatron at the moment).

The color balance on that preset is also kind of red tinged. You can fix this by making all of the “Scanline Min” settings match, then making all the “Scanline Max” settings match.
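If you'd rather not re-enter those values every time, the same tweaks can be baked into a small .slangp override preset. A minimal sketch, with the reference path and the `hcrt_`-prefixed parameter names as assumptions from memory (check the exact names in Shader Parameters on your install before relying on them):

```
#reference "shaders_slang/hdr/crt-sony-megatron-default-hdr.slangp"
hcrt_max_nits = "800.0"
hcrt_paper_white_nits = "660.0"
```

Save it with a .slangp extension and load it like any other preset; the listed parameters are overridden on load.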


After you do all that @Azure suggested you can also try out my CyberLab Megatron Death To Pixels Shader Preset packs, most of which were developed on an LG OLED E6P.

I’m not sure if you missed the Sony Megatron Colour Video Monitor setup instructions or if you haven’t updated your Slang Shaders in a while, but you’re supposed to go into Shader Parameters and adjust the Peak and Paper White Luminance values until things look correct, including bright enough on your particular display.

When trying to convey what you’re experiencing using Sony Megatron Colour Video Monitor a photo or video of the screen might be more meaningful.

Here’s my results using your CRT Megatron RGB 709 (Aperture grille) [2160p].slangp after implementing your suggestions.

@Cyber I’ve tried out your shader. I’m pretty OCD about not having redundant settings; my preference is a preset that only does what I need it to do. Funny thing is, the more I mess around with these presets, the more I’m confused by it all. I’m willing to just suck it up.

When I zoom in on the picture I posted, each individual BGR triad seems to be at peak luminance, but when I view it at its original size it’s washed out. I can try to compensate for it, but it looks worse the further I push shader parameters. The punch I’m expecting to see is what the game looks like when setting Display’s Resolution to 1080p, which looks dang near perfect but destroys the phosphor pattern.

What do you need a preset to do and what redundant settings are you talking about?

You’re probably doing too much. Maybe you should give a step-by-step of what you’re doing to get to this point.

If you go back and read this thread you’ll see that it’s not supposed to be that hard. You load up a preset, you go into Shader Parameters and you increase your Peak Luminance and Paper White Luminance.

What do you have your Peak and Paper White Luminance set to?

Try 630 for both and see if it looks better.

Also, what source device are you running RetroArch on, and how is that set up?

Are you outputting RGB 4:4:4 Full 8bit/10bit/12bit colour?

Did you change your HDMI input label to PC?

Did you enable HDMI Ultra Deep Colour on that input?

Are you using HDR Game Mode?

These are some examples of how my presets look:


This sounds like you aren’t getting 444 chroma.

If you are using an LG TV, you need to set the HDMI icon to PC and enable HDMI Deep Color (General>Devices>HDMI Settings on an LG C1), in addition to having 444 enabled system side.


Pardon if this was talked before in the thread, but a quick search turned up nothing.

I wanted to try this amazing shader now that HDR works on the major Linux desktops (I’m using the KDE Plasma desktop). KDE has a neat HDR calibration tool in the monitor settings, where you set your max luminance and paper-white value. The usual stuff.

After setting those values up, I set the same inside RetroArch on the HDR menu, and then I set them again on the Megatron parameters (so, I set them in 3 places in total).

But I noticed the colors are washed out and the brightest values seem to be clipping into each other. The overall picture looks very wrong.

Then I went again to the KDE monitor settings and set the HDR max luminance to its maximum allowed value (2000 nits), and now Megatron looks much, much nicer!
Perhaps double tone-mapping is happening? Once in the shader, and again in the KDE desktop? Has anyone else had this experience?

Still, it feels like I can’t reach an acceptable brightness value. My TV is an LG C1, which has about 720 nits maximum brightness on a 10% window. Am I doing anything wrong while configuring this shader, or is this to be expected on this TV? Help is appreciated! :slight_smile:

(I’m using the crt-sony-megatron-aeg-CTV-4800-VT-hdr.slangp preset with everything at defaults except the brightness settings.)


Sharing photos of the display showing the issue really helps much more than text descriptions of a visual problem.

The first person I heard from who tried HDR in Linux ended up discovering some bugs in the Linux HDR implementation. Make sure you’re seeing the changes in real-time when you adjust any of these HDR settings.

It’s possible that the default presets you’re trying could look washed out on your display. If all else fails, you can try increasing the Saturation after you get your brightness looking alright.

One thing to remember is that those Peak HDR Brightness measurements for the TV all use the white subpixel. RGB based presets in Sony Megatron don’t use the white subpixel at all so your effective Peak Luminance might actually be a little lower.

For WOLED displays, set Display’s Subpixel Layout to RWBG/WOLED, set Colour Accurate/Mask Accurate to Mask Accurate, and you can try Vivid Mode in the shader.

Don’t be surprised if you have to set your Paper White Luminance almost or just as high as your Peak Luminance to get acceptable brightness, and don’t be surprised if you have to increase your Saturation a bit before your colours start to look accurate, in particular Red.

Different Phosphor Types also have vastly different Saturation levels.

Feel free to try Sony Megatron preset packs. You might actually have an easier time getting them to look right on your OLED Display as most of them were made using an LG OLED Display.

Yes! This sounds like a very plausible explanation. Indeed, the white subpixel is by far the most powerful on these types of WOLED displays. Seeing as Megatron only uses the RGB ones, naturally I would need to set the peak luminance value quite a bit lower than the TV’s maximum.

Indeed, at lower brightness levels the colors look totally fine in the shader, and it’s only when I try to compensate for the low brightness (by raising the paper-white luminance) that the colors start looking wrong. So I might be hitting my TV’s RGB subpixels’ peak brightness without reaching a satisfactory brightness level for the shader for normal usage.

In this case I might need to use a shader with mask compensation features, like Guest’s and use HDR to boost overall brightness.

Yes, I am using the RWBG subpixel layout.
I will try the other things you suggested and see how close I can get to an acceptable picture. Thank you!

Ah yes! Good idea! I will try your Megatron presets and see what I can get out of those :slight_smile:
I will report back.


When I used to sit far from my OLED TV I used to use 630/630 for Peak/Paper White Luminance.

When I sat closer, I started using 630/450.

I’ve recently observed that decreasing Viewport Size to 6X provides a nice increase in brightness over 8X and higher.

All else being equal, Shadow Mask and Aperture Grille presets tend to be brighter than Slot Mask Presets.

You should be able to get good enough brightness from an LG C1.

For a hybrid approach though, you can check out my CyberLab Mega Bezel HDR Ready presets.


When compiling the .slang shader manually, trying to use crt-sony-megatron.slang alone without the other shaders gives a “failed to apply” error.

crt-sony-megatron.slang calls back to crt-sony-megatron-source-pass.slang (as alias SourceSDR) and crt-sony-megatron-hdr-pass.slang (as alias SourceHDR).

You can replace those with instances of stock.slang, keeping the appropriate aliases, and it will load. You would need to convert the image to HDR by some other means though, otherwise the result is extremely washed out.
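For reference, the pass replacement described above could look something like this as a minimal .slangp sketch (paths are illustrative and depend on where stock.slang and the Megatron sources live in your shaders tree):

```
shaders = "3"

shader0 = "../../stock.slang"
alias0 = "SourceSDR"

shader1 = "../../stock.slang"
alias1 = "SourceHDR"

shader2 = "crt-sony-megatron.slang"
```

The aliases are what matter: crt-sony-megatron.slang samples the SourceSDR and SourceHDR passes by name, so the stand-in passes have to keep those names even though they just pass the image through.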

What exactly are you trying to do here?


I have been tinkering with ShaderGlass in Tainted Grail: The Fall of Avalon and found the Megatron preset is great at offering an atmosphere that suits this game, reminding me of my old Panasonic CRT gaming memories. There are areas in the game so extremely red that I can barely see anything while using the aforementioned settings.

Did you know that there’s a Reshade version of Sony Megatron Colour Video Monitor?

There’s also this:

Post some photos of the screen so we can see.

What settings?

I think I’m ready to have a play around with this. Does anyone want to try before I get a chance?

Where can this gamma cut-off point shader argument be found?

Also what can be done to ensure that the horizontal slots of the slot mask don’t get in the way as they’re not supposed to be affected or visible in this scenario?

That would be the “Inverse Gamma Cutoff (x1000)” setting, at the tail end of the CRT Settings section in the Shader Parameters.

Keep in mind that it’s not going to have nearly as profound an effect as you want it to, though. As MajorPainTheCactus mentioned, a properly set up CRT’s illuminated shadow mask and imperfect blacks aren’t so noticeable in actual use. What you are seeing is a camera artifact, similar to the rolling/hum bars that appear when a CRT is filmed at an unmatched frame rate/shutter speed.


Thank you very much Azurfel.

I’m not looking to create a profound effect actually.

When I use the Base Black Mask feature in CRT-Guest-Advanced, it’s extremely subtle, as is the noise effect in my Mega Bezel and the vast majority of my Mega Bezel presets. So this is exactly what I would hope to reproduce.

Yeah, I understand what you’re saying, but it is there in the case of the former. You’d never pick it up in a Sony Megatron photo unless we set it up and emulate it. You will see it in CRT photos though.

Update: I’ve tried the full range available in the Shader Parameters for the “Inverse Gamma Cutoff (x1000)” and I can see no change in anything.

AzMods20251014

Fully updated for crt-guest-advanced-2025-10-12-release1 (rev 3/final), with a decoupled version of the new PAL shader (crt-guest-advanced-pal-decoupled), and new PAL presets made to take advantage of guest.r’s most recent work.


AzMods20251018

Readme with additional details

Built on top of:

  • Image Adjustment (2024.02.07) by hunterk

  • crt-guest-advanced-2025-10-18-release1 (rev5) by guest.r and Dr. Venom, based in part on Themaister’s NTSC shader

  • Sony Megatron Colour Video Monitor (2023.10.09) by MajorPainTheCactus

  • with additional functions based on or inspired by Grade by Dogway

All included “CRT Megatron” presets are currently tuned for the LG C1 and other similar displays (WRGB subpixel layout, with approximately 800 nits Peak Luminance).

The included version of Image Adjustment has been modified to allow for finer control of the effects as i found appropriate.

crt-guest-advanced-ntsc has been modified to add or expand the following options

GPGX MS color fix

Corrects Genesis Plus GX’s Master System color output, which includes minor errors i discovered while implementing the Sega MS Nonlinear Blue Fix.

  • 0=off
  • 1=on (color saturation scaled to a maximum value of RGB 255)
  • 2=sat239 (scaled to a maximum value of RGB 239)
  • 3=sat210 (scaled to a maximum value of RGB 210)
  • 4=sat165 (scaled to a maximum value of RGB 165)

Sega MS Nonlinear Blue Fix

An implementation of the behavior described in Notes & Measures: Nonlinear Blue on Sega Master System 1 & Other Findings by bfbiii.

This setting automatically adjusts to work with the GPGX MS color fix settings.

Sega MD RGB Palette

An implementation/approximation of the Mega Drive/Genesis RGB palette as discussed here.

Downsample Pseudo Hi-Res

As i understand it, 15KHz CRT displays would treat double-horizontal-resolution modes (512x224, 640x240, etc.) as though they were not doubled, resulting in a blending effect called pseudo hi-res. A number of SFC/SNES games are known to have used this behavior for transparency effects, including Breath of Fire II, Jurassic Park, and Kirby’s Dream Land 3, and as far as i know it is the correct behavior for any device originally meant to be displayed on a 15KHz CRT TV/monitor.

  • 1 = off

  • 2 = Triggers the blending effect whenever the horizontal resolution is more than twice the vertical resolution. This works well with cores that either always output a pseudo hi-res image for compatibility (such as bsnes-jg), or cores that only use pseudo hi-res for pseudo hi-res content (such as SwanStation). True high-resolution/interlaced content is not affected.

  • 3 = Triggers the blending effect whenever the horizontal resolution is 480 or higher. This is needed for cores that display pseudo hi-res content in a true high-resolution container (such as Mesen-S and a number of bsnes variants). Unfortunately, this halves the resolution of true high-resolution/interlaced content, as there is no way to differentiate pseudo hi-res and true high-resolution/interlaced content in these cores.
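The two trigger modes above boil down to a simple resolution test. A sketch of that decision rule in Python (not the shader’s actual code, just the logic as described):

```python
def pseudo_hires_blend(width: int, height: int, mode: int) -> bool:
    """Decide whether to apply the pseudo hi-res blending effect.

    mode 1: always off.
    mode 2: blend when the horizontal resolution is more than twice
            the vertical resolution (e.g. 512x224, 640x240).
    mode 3: blend whenever the horizontal resolution is 480 or higher,
            for cores that use a true high-resolution container.
    """
    if mode == 2:
        return width > 2 * height
    if mode == 3:
        return width >= 480
    return False

# 512x224 SNES pseudo hi-res triggers in both modes:
print(pseudo_hires_blend(512, 224, 2))  # True
# 512x448 true high-res is left alone in mode 2 but halved in mode 3:
print(pseudo_hires_blend(512, 448, 2))  # False
print(pseudo_hires_blend(512, 448, 3))  # True
```

This also makes the trade-off visible: mode 3 cannot tell 512x448 pseudo hi-res apart from genuine 512x448 high-resolution content, since both clear the 480-pixel threshold.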

Internal Resolution

Modified to allow up to 1/16th downsampling. (It’s a surprise tool that will help us later.)

Sony Megatron has been modified to add or expand the following options

HDR: Content Color Gamut

Out of the box, RetroArch and Megatron clamp colors to the Rec. 709 gamut (Expand Gamut set to Off in RetroArch, or HDR: Original/Vivid set to Original in Megatron), or stretch that Rec. 709 gamut to an unusual non-standard gamut created by someone at Microsoft (?Chuck Walbourn?) called Expanded 709 (Expand Gamut set to On in RetroArch, or HDR: Original/Vivid set to Vivid in Megatron).

Obviously, this is undesirable, as all of the major “retro” color gamuts include colors that fall outside of Rec. 709.

Serendipitously, i found that it was possible to turn this problem into its own solution by simply adding additional color gamuts to the “HDR: Original/Vivid” setting, renaming it “HDR: Content Color Gamut” to better reflect its newfound purpose.

When using this setting, Colour System should be set to 0/r709, and Phosphors should be set to 0/NONE.

Options are as follows:

  • 0=Rec 709/sRGB (SDR HDTV/Windows gamut)
  • 1=Expanded 709
  • 2=NTSC 1953 (The OG color system that was only really used for like 5-8ish years back when basically no one owned a color TV anyway. If you are Brazilian or from a SECAM region, it may also match some old CRT TVs you’ve used with really weirdly intense greens? Hard to say. This sort of thing is kind of underdocumented.)
  • 3=RCA 1958 (?1961?) (Millennial’s grandparent’s old TV with weird colors #1.)
  • 4=RCA 1964 (Millennial’s grandparent’s old TV with weird colors #2.)
  • 5=SMPTE C/Rec 601-525 line/Conrac (Baseline standard gamut for Analog NTSC.)
  • 6=PAL/Rec 601-625 line (Baseline standard gamut for Analog PAL.)
  • 7=Dogway’s NTSC-J (Baseline standard gamut for Analog NTSC-J.)
  • 8=P22_80s (Dogway’s Grade gamut for 1980s-early 1990s TVs.)
  • 9=Apple RGB/Trinitron PC (Should approximate basically any Trinitron monitor from 1987-the mid to late 1990s. By the early 00s, they were SMPTE C instead, at least for high end monitors like the FW900.)
  • 10=guest’s Philips PC (Gamut used by a number of extremely popular monitors that used Philips tubes, including Philips CM8533, Philips VS-0080, and Commodore 1084)
  • 11=P22_90s (Dogway’s Grade gamut for mid 1990s TVs with tinted phosphors.)
  • 12=RPTV_95s (Dogway’s Grade gamut for late 90s/early 00s rear projection TVs that game manuals said you shouldn’t play games on due to burn in risk.)
  • 13=Display P3/P3-D65 (Common wide color gamut. Variant on the gamut used for film with shared primaries. Might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader?)
  • 14=Rec 2020 (HDR gamut. Again, might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader.)

Gamut Overshoot Fix

A fix MajorPainTheCactus came up with to deal with the color errors i noticed using lilium’s HDR analysis shader. (Sony Megatron Colour Video Monitor)

He decided not to implement it at the time, as he didn’t think it would make a perceptible difference, but a friend and i both came to the conclusion that it makes certain test pattern colors look more like they do with no shaders applied, so i have continued to use it. There should be no downside. (Sony Megatron Colour Video Monitor)

  • 0=no fix
  • 1=the 7-decimal clamp
  • 2=the 6-decimal clamp (results in non-zero blacks, perceptible in a sufficiently dark room)

Scanline Multiplier/Divisor

Multiplies or divides the number of scanlines.

Useful for cases like DOS games meant for 31KHz displays, which output 320x200 that was then double scanned to 640x400.

The divisor options are handy for displaying increased internal resolution output from 3D cores that don’t include a good downsample option, such as PPSSPP. I strongly recommend using this setting in conjunction with crt-guest-advanced-ntsc’s Internal Resolution setting to reduce shimmering.

  • -1,0,1=1x (Default/off)
  • 2=2x (Doublescan)
  • 3=Auto 2x (Automatically doublescans any content with a vertical resolution of less than 300, while leaving higher resolution content unchanged)
  • -2=1/2 (Reduces the scanline count to one half default)
  • -3=1/3 (Reduces the scanline count to one third default)
  • -4=1/4 (Reduces the scanline count to one quarter default)
  • -5=1/5 (etc.)
  • -6=1/6
  • -7=1/7
  • -8=1/8
  • -9=1/9
  • -10=1/10
  • -11=1/11
  • -12=1/12
  • -13=1/13
  • -14=1/14
  • -15=1/15
  • -16=1/16
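The mapping from parameter value to effective scanline multiplier can be sketched like this (Python, modelling the option list above rather than the shader’s actual source):

```python
def scanline_multiplier(setting: int, vertical_res: int = 240) -> float:
    """Effective scanline-count multiplier for a given parameter value.

    -1, 0, 1 -> 1x (off); 2 -> forced doublescan; 3 -> auto doublescan
    for content under 300 lines; -2 through -16 -> divide by |setting|.
    """
    if setting == 2:
        return 2.0
    if setting == 3:
        return 2.0 if vertical_res < 300 else 1.0
    if setting <= -2:
        return 1.0 / abs(setting)
    return 1.0  # covers -1, 0, 1

print(scanline_multiplier(3, 224))   # 2.0  (240p-class content doublescanned)
print(scanline_multiplier(3, 480))   # 1.0  (high-res content left alone)
print(scanline_multiplier(-4))       # 0.25 (e.g. pairing with 4x internal res)
```

The Auto 2x case is why the 300-line threshold matters: 224- and 240-line console output gets doublescanned like a 31KHz monitor would show DOS games, while 480-line content passes through untouched.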