Sony Megatron Colour Video Monitor

Oh yeah, that’s why I said ‘Obviously it varies from display to display and signal to signal’; I meant that in the context of CRTs. I just wondered what’d be more representative of an average experience on a general consumer CRT setup; I’m not looking for a PVM experience or anything. For a standard, middle-of-the-road setup, I figured the presentation would be somewhat comparable between sets once you’re actually sat back at a normal distance. Sure, TVL and mask type and all that will influence things to some degree, but would they affect things to the point of the game’s resolution itself varying that dramatically? That’s what I wondered: whether some CRTs at 480i are giving a comparable image to 720p on an emulator. You say both could be just as accurate, so I’m assuming the answer is ‘yes’.

A CRT isn’t magically going to make the 512x448 or 640x448 output of the PS2 look like the 1024x896 or 1280x896 that PCSX2 is outputting at “720p” internal resolution.

The amount of aliasing may be more perceptually similar in some ways, I guess, depending on the CRT TV? But PCSX2 is going to be putting out significantly greater detail at those settings, no ifs, ands, or buts. We are talking double the resolution in each dimension here, so 4x the pixel count.
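
Concretely, doubling each dimension quadruples the pixel count:

$$
512 \times 448 = 229{,}376 \qquad 1024 \times 896 = 917{,}504 = 4 \times (512 \times 448)
$$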

1 Like

After doing some tests and comparing, I could see why someone would consider a 2x-resolution, blown-up PCSX2 picture closer to a CRT TV than native. But once CRT shaders come into play, this should be negated, at least for the majority of graphics. Maybe it’s useful for the mentioned UI issues, rendering fonts better if you look closely, things like that.

2 Likes

Ah, thank you very much for giving it a go. You have my gratitude, and you’ve put my curiosities to rest. In person really is the only way to tell with stuff like that, since pictures from someone’s phone (which are all I have access to for comparison) never paint the full picture.

Hey, Major. Using your crt-sony-megatron-default-hdr.slangp preset turns in results like what I’ve attached. I’m displaying it on my 4K, 800-nit OLED, and while I know that using shader masks will block out a percentage of light, I feel like what I’m getting isn’t normal. RetroArch is set to HDR, Windows is calibrated with Auto HDR disabled… I don’t know what else to include.

Are RetroArch’s menus displaying in HDR?

And is the resolution set to 2160p? (Just checking since that screenshot is 1080p)

Yes, and yes. Man, that menu text is blinding. :dizzy_face: The screenshot I took was too big for the forum, so I had to shrink it.

So despite being called “default”, that preset is kind of wonky. I have a testing fork that includes more refined presets built specifically for LG C1s and other 800 nit LG panels (most recent version here). Make sure you have HGiG enabled in your display’s settings.

If you would rather tweak settings yourself: go into Shader Parameters, set HDR: Display’s Peak Luminance to 800, and HDR: Display’s Paper White Luminance to 650-670ish. Again, make sure you have HGiG enabled in your display’s settings.

You should also set “Display’s Subpixel Layout” to match your panel. (That would be RWBG if you have an LG panel other than the G5, which uses a new BWRG layout that isn’t supported in Megatron at the moment.)

The color balance on that preset is also kind of red-tinged. You can fix this by making all of the “Scanline Min” settings match, then making all the “Scanline Max” settings match.
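
If you’d rather keep those tweaks in a file than redo them in the menu each time, you can save a simple preset that references the stock one and overrides just those parameters. A minimal sketch, assuming the stock preset path in your install and the parameter names used by recent versions of Megatron (hcrt_max_nits, hcrt_paper_white_nits, and the per-channel scanline min/max values — the numbers below are illustrative, so verify the names and defaults in your Shader Parameters list):

```
#reference "shaders_slang/hdr/crt-sony-megatron-default-hdr.slangp"
hcrt_max_nits = "800.000000"
hcrt_paper_white_nits = "650.000000"
hcrt_red_scanline_min = "0.500000"
hcrt_green_scanline_min = "0.500000"
hcrt_blue_scanline_min = "0.500000"
hcrt_red_scanline_max = "1.000000"
hcrt_green_scanline_max = "1.000000"
hcrt_blue_scanline_max = "1.000000"
```

Using one shared min value across all three channels and one shared max value is what evens out that red tinge.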

1 Like

After you do all that @Azure suggested you can also try out my CyberLab Megatron Death To Pixels Shader Preset packs, most of which were developed on an LG OLED E6P.

I’m not sure if you missed the Sony Megatron Colour Video Monitor setup instructions or if you haven’t updated your Slang Shaders in a while, but you’re supposed to go into Shader Parameters and adjust the Peak and Paper White Luminance values until things look correct, including bright enough on your particular display.

When trying to convey what you’re experiencing using Sony Megatron Colour Video Monitor, a photo or video of the screen might be more meaningful.

Here are my results using your CRT Megatron RGB 709 (Aperture grille) [2160p].slangp after implementing your suggestions.

@Cyber I’ve tried out your shader. I’m pretty OCD about not having redundant settings; my preference is a preset that only does what I need it to do. Funny thing is, the more I mess around with these presets, the more I’m confused by it all. I’m willing to just suck it up.

When I zoom in on the picture I posted, each individual BGR triad seems to be at peak luminance, but when I view it at its original size it’s washed out. I can try to compensate for it, but it looks worse the further I push shader parameters. The punch I’m expecting to see is what the game looks like when setting Display’s Resolution to 1080p, which looks dang near perfect but destroys the phosphor pattern.

What do you need a preset to do and what redundant settings are you talking about?

You’re probably doing too much. Maybe you should give a step-by-step of what you’re doing to get to this point.

If you go back and read this thread you’ll see that it’s not supposed to be that hard. You load up a preset, you go into Shader Parameters and you increase your Peak Luminance and Paper White Luminance.

What do you have your Peak and Paper White Luminance set to?

Try 630 for both and see if it looks better.

Also, what source device are you running RetroArch on, and how is it set up?

Are you outputting RGB 4:4:4 Full 8bit/10bit/12bit colour?

Did you change your HDMI input label to PC?

Did you enable HDMI Ultra Deep Colour on that input?

Are you using HDR Game Mode?

These are some examples of how my presets look:

This sounds like you aren’t getting 4:4:4 chroma.

If you are using an LG TV, you need to set the HDMI icon to PC and enable HDMI Deep Color (General > Devices > HDMI Settings on an LG C1), in addition to having 4:4:4 enabled system-side.

1 Like

Pardon if this was discussed earlier in the thread, but a quick search turned up nothing.

I wanted to try this amazing shader now that HDR works on the major Linux desktops (I’m using the KDE Plasma desktop). KDE has a neat HDR calibration tool in the monitor settings, where you set your max luminance and paper-white value. The usual stuff.

After setting those values up, I set the same inside RetroArch on the HDR menu, and then I set them again on the Megatron parameters (so, I set them in 3 places in total).
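
For reference, the RetroArch side of that (Settings > Video > HDR) ends up in retroarch.cfg as keys like the following — key names as in current RetroArch builds, values purely illustrative, so double-check against your own config:

```
video_hdr_enable = "true"
video_hdr_max_nits = "720.000000"
video_hdr_paper_white_nits = "200.000000"
```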

But I noticed the colors are washed out and the brightest values seem to be clipping into each other. The overall picture looks very wrong.

Then I went back to the KDE monitor settings and set the HDR max luminance to its maximum allowed value (2000 nits), and now Megatron looks much, much nicer!
Perhaps double tone-mapping is happening: once in the shader, and again in the KDE desktop? Has anyone else had this experience?

Still, it feels like I can’t reach an acceptable brightness value. My TV is an LG C1, which has about 720 nits maximum brightness on a 10% window. Am I doing anything wrong while configuring this shader, or is this to be expected on this TV? Help is appreciated! :slight_smile:

(I’m using the crt-sony-megatron-aeg-CTV-4800-VT-hdr.slangp preset with everything at defaults except the brightness settings.)

1 Like

Sharing photos of the display showing the issue really helps much more than text descriptions of a visual problem.

The first person I heard from who tried HDR in Linux ended up discovering some bugs in the Linux HDR implementation. Make sure you’re seeing the changes in real-time when you adjust any of these HDR settings.

It’s possible that the default presets you’re trying could look washed out on your display. If all else fails, you can try increasing the Saturation after you get your brightness looking alright.

One thing to remember is that those Peak HDR Brightness measurements for the TV all use the white subpixel. RGB-based presets in Sony Megatron don’t use the white subpixel at all, so your effective Peak Luminance might actually be a little lower.

For WOLED displays, set “Display’s Subpixel Layout” to RWBG/WOLED, set Colour Accurate/Mask Accurate to Mask Accurate, and you can try Vivid Mode in the shader.

Don’t be surprised if you have to set your Paper White Luminance almost or just as high as your Peak Luminance to get acceptable brightness, and don’t be surprised if you have to increase your Saturation a bit before your colours start to look accurate, Red in particular.

Different Phosphor Types also have vastly different Saturation levels.

Feel free to try the Sony Megatron preset packs. You might actually have an easier time getting them to look right on your OLED display, as most of them were made using an LG OLED display.

Yes! This sounds like a very plausible explanation. Indeed, the white subpixel is by far the most powerful on these types of WOLED displays. Seeing as Megatron only uses the RGB ones, naturally I would need to set the peak luminance value quite a bit lower than the TV’s maximum.

Indeed, at lower brightness levels the colors look totally fine in the shader, and it’s only when I try to compensate for the low brightness (by raising the paper-white luminance) that the colors start looking wrong. So I might be hitting my TV’s RGB subpixels’ peak brightness without reaching a satisfactory brightness level for normal usage of the shader.

In this case I might need to use a shader with mask compensation features, like Guest’s, and use HDR to boost the overall brightness.

Yes, I am using the RWBG subpixel layout.
I will try the other things you suggested and see how close I can get to an acceptable picture. Thank you!

Ah yes! Good idea! I will try your Megatron presets and see what I can get out of those :slight_smile:
I will report back.

1 Like

When I used to sit far from my OLED TV I used to use 630/630 for Peak/Paper White Luminance.

When I sat closer, I started using 630/450.

I’ve recently observed that decreasing Viewport Size to 6X provides a nice increase in brightness over 8X and higher.

All else being equal, Shadow Mask and Aperture Grille presets tend to be brighter than Slot Mask Presets.

You should be able to get good enough brightness from an LG C1.

For a hybrid approach though, you can check out my CyberLab Mega Bezel HDR Ready presets.

1 Like

When compiling the .slang shader manually and trying to use crt-sony-megatron.slang alone, without the other shaders, it fails to apply.

crt-sony-megatron.slang calls back to crt-sony-megatron-source-pass.slang (as alias SourceSDR) and crt-sony-megatron-hdr-pass.slang (as alias SourceHDR).

You can replace those with instances of stock.slang, keeping the appropriate aliases, and it will load. You would need to convert the image to HDR by some other means though, otherwise the result is extremely washed out.
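
A minimal sketch of what that preset looks like, assuming the standard slang-shaders directory layout with the .slangp saved in the hdr folder (adjust the relative paths to your install):

```
shaders = "3"

shader0 = "../stock.slang"
alias0 = "SourceSDR"

shader1 = "../stock.slang"
alias1 = "SourceHDR"

shader2 = "shaders/crt-sony-megatron.slang"
```

The two stock passes just pass the image through under the aliases the final pass expects, which is why you still need to get an HDR image from somewhere else in the chain.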

What exactly are you trying to do here?

2 Likes

I have been tinkering with ShaderGlass in Tainted Grail: The Fall of Avalon and found the Megatron preset is great at providing an atmosphere that suits this game, reminding me of gaming memories on my old Panasonic CRT. There are some extremely red areas in the game where I can barely see anything while using the aforementioned settings.

Did you know that there’s a ReShade version of Sony Megatron Colour Video Monitor?

There’s also this:

Post some photos of the screen so we can see.

What settings?