Sony Megatron Colour Video Monitor

Is that in reply to me?

I’m asking, if you try to replicate how 480i looks on a PS2 with a CRT as closely as possible, does running an emulated game with a 480p shader look more accurate if the internal res is 480p or 720p? Off number-matching alone I’d have thought it should be the 480p one, but I’m curious since that comment I read said 720p looked closer. I just figured a sample size of ‘1’ isn’t exactly big, so I was curious to see what other people would make of it who have the means to do the same test.

Say you run this shader through ReShade and swap the game’s resolution between 480p and 720p on an OLED or whatever, with an actual PS2 and CRT running the same game at native res next to it: which looks like a closer match to the real PS2 on a real CRT? The 480p emulation, or the 720p one?

Someone who made progressive patches recommended increasing the internal resolution for games that absolutely need to be deinterlaced, so maybe that comment was in reference to this. However, that would also suggest it’s very game dependent. For many games deinterlacing isn’t relevant; they render fine in progressive if you force them into that on real hardware. It’s hard to see how those games won’t look rather inaccurate if you double the native resolution (in practice the “720p” setting should result in 896 lines internally most of the time; on PAL, optimized games typically use 512 lines natively, so doubling already brings you close to 1080p…)

I could test some games on real hardware and a CRT TV versus raw PCSX2 output (my monitor isn’t really suited for slot mask TV mimicking attempts).

I’d be very appreciative if you ever decide to do that. As someone who’s gone back and forth between 480p and 720p in PCSX2 without native hardware on hand for comparison, I feel like I can see arguments for both being more accurate, but obviously it’s all assumptions I’m making by feel.

On the one hand, 720p sometimes looks maybe a little too pristine, to where I’m like ‘Surely this game at 480i/p on native hardware doesn’t look this clean?’, but at other times, when running at 480p, some games make me think ‘This image looks too rough in a way that doesn’t feel right’. Some games look better than others; I’ve played some fighting games at 480p and thought ‘This seems fine’, but then I play a game like Burnout 3 and it can be kinda hard to see oncoming traffic and stuff properly because everything’s so low-res looking.

Obviously it varies from display to display and signal to signal, but I know a lot of people say a CRT aids a lot of 480 games quite noticeably with an almost natural, built-in anti-aliasing, which I don’t feel like shaders particularly provide much of at that resolution. For text and stuff, yes, but not really for the 3D assets. But if you boost the resolution a bit, keeping the shader set up for 480p content, brute-forcing more pixels obviously means less aliasing, hence I can see how, in theory, that might more closely mirror how the proper hardware looks, even if the resolutions between the two aren’t the same.

I wouldn’t even NEED the shader you use to be a slot mask; the mask won’t really affect that, so if an aperture grille or something’s more convenient then no worries. All the test would be is having that set up and going back and forth between 480 and 720, then looking over at the real deal and seeing which is a nearer match.

Also damn it I keep forgetting to post these as replies and having to delete my posts.

You do know that CRTs and their screens and masks came in all different shapes and sizes, right? Both could be just as accurate because CRTs varied quite a bit.

Oh yeah, that’s why I said ‘Obviously it varies from display to display and signal to signal’; I meant that in the context of CRTs. I just wondered what’d be more representative of an average experience on a general consumer CRT setup; I’m not looking for a PVM experience or anything. As far as a standard, middle-of-the-road setup goes, I figured the presentation would be somewhat comparable between sets once you’re actually sat back at a normal distance. Sure, TVL and mask type and all that will influence things to some degree, but would it affect things to the point of the game’s resolution itself varying that dramatically? That’s what I was wondering: whether some CRTs at 480i give a comparable image to 720p on an emulator. You say both could be just as accurate, so I’m assuming the answer is ‘yes’.

A CRT isn’t magically going to make the 512x448 or 640x448 output of the PS2 look like the 1024x896 or 1280x896 that PCSX2 is outputting at “720p” internal resolution.

The amount of aliasing may be more perceptually similar in some ways, I guess, depending on the CRT TV? But PCSX2 is going to be putting out significantly greater detail at those settings, no ifs, ands, or buts. We are talking double the resolution here, so 4x the pixel density.
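To put rough numbers on that “4x”, using the 640x448 figure above:

```
 640 x  448 =   286,720 pixels per frame (native PS2 output)
1280 x  896 = 1,146,880 pixels per frame (PCSX2 at "720p" internal res)
1,146,880 / 286,720 = 4x as many pixels over the same screen area
```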


After doing some tests and comparing, I could see why someone would consider a 2x-resolution, blown-up PCSX2 picture closer to a CRT TV than native res output. But once CRT shaders come into play, this should be negated, at least for the majority of the graphics. Maybe it’s useful for the mentioned UI issues, rendering fonts better if you look closely, things like that.


Ah, thank you very much for giving it a go. You have my gratitude, and you’ve put my curiosities to rest. In-person really is the only way to tell with stuff like that, since pictures from someone’s phone (which is all I have access to for comparison) never paint the full picture.

Hey, Major. Using your crt-sony-megatron-default-hdr.slangp preset turns in results like what I’ve attached. I’m displaying it on my 4K, 800 nit OLED, and while I know that using shader masks will block out a percentage of light, I feel like what I’m getting isn’t normal. RetroArch is set to HDR, Windows is calibrated with Auto HDR disabled…I don’t know what else to include.

Are RetroArch’s menus displaying in HDR?

And is the resolution set to 2160p? (Just checking since that screenshot is 1080p)

Yes, and yes. Man, that menu text is blinding. :dizzy_face: The screenshot I took was too big for the forum, so I had to shrink it.

So despite being called “default”, that preset is kind of wonky. I have a testing fork that includes more refined presets built specifically for LG C1s and other 800 nit LG panels (most recent version here). Make sure you have HGiG enabled in your display’s settings.

If you would rather tweak settings yourself: go into Shader Parameters, set HDR: Display’s Peak Luminance to 800, and HDR: Display’s Paper White Luminance to 650-670ish. Again, make sure you have HGiG enabled in your display’s settings.

You should also set “Display’s Subpixel Layout” to match your panel. (That would be RWBG if you have an LG panel other than the G5, which uses a new BWRG layout that isn’t supported in Megatron at the moment).

The color balance on that preset is also kind of red-tinged. You can fix this by making all of the “Scanline Min” settings match, then making all the “Scanline Max” settings match.
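If you’d rather keep those tweaks around, you can save the result as a shader preset and they end up as plain key = value lines in the saved .slangp. A rough sketch of what that might look like is below; the parameter names are written from memory, so double-check them against what RetroArch actually writes for your version of the shader:

```
# Example overrides in a saved crt-sony-megatron HDR preset (names illustrative).
hcrt_max_nits = "800.000000"          # HDR: Display's Peak Luminance
hcrt_paper_white_nits = "660.000000"  # HDR: Display's Paper White Luminance
# Also set Display's Subpixel Layout to match your panel (RWBG on most LG OLEDs)
# and make the red/green/blue "Scanline Min" values match, likewise the "Scanline Max" values.
```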


After you do all that @Azure suggested you can also try out my CyberLab Megatron Death To Pixels Shader Preset packs, most of which were developed on an LG OLED E6P.

I’m not sure if you missed the Sony Megatron Colour Video Monitor setup instructions or if you haven’t updated your Slang Shaders in a while, but you’re supposed to go into Shader Parameters and adjust the Peak and Paper White Luminance values until things look correct, including being bright enough on your particular display.

When trying to convey what you’re experiencing using Sony Megatron Colour Video Monitor, a photo or video of the screen might be more meaningful.

Here’s my results using your CRT Megatron RGB 709 (Aperture grille) [2160p].slangp after implementing your suggestions.

@Cyber I’ve tried out your shader. I’m pretty OCD about not having redundant settings; my preference is a preset that only does what I need it to do. Funny thing is, the more I mess around with these presets the more I’m confused by it all. I’m willing to just suck it up.

When I zoom in on the picture I posted, each individual BGR triad seems to be at peak luminance, but when I view it at its original size it’s washed out. I can try to compensate for it, but it looks worse the further I push shader parameters. The punch I’m expecting to see is what the game looks like when setting Display’s Resolution to 1080p, which looks dang near perfect but destroys the phosphor pattern.

What do you need a preset to do and what redundant settings are you talking about?

You’re probably doing too much. Maybe you should give a step by step of what you’re doing to get to this point.

If you go back and read this thread you’ll see that it’s not supposed to be that hard. You load up a preset, you go into Shader Parameters and you increase your Peak Luminance and Paper White Luminance.

What do you have your Peak and Paper White Luminance set to?

Try 630 for both and see if it looks better.

Also, what source device are you running RetroArch on and how is that setup?

Are you outputting RGB 4:4:4 Full 8bit/10bit/12bit colour?

Did you change your HDMI input label to PC?

Did you enable HDMI Ultra Deep Colour on that input?

Are you using HDR Game Mode?

These are some examples of how my presets look:


This sounds like you aren’t getting 444 chroma.

If you are using an LG TV, you need to set the HDMI icon to PC and enable HDMI Deep Color (General>Devices>HDMI Settings on an LG C1), in addition to having 444 enabled system side.


Pardon if this was discussed before in the thread, but a quick search turned up nothing.

I wanted to try this amazing shader now that HDR works on the major Linux desktops (I’m using the KDE Plasma desktop). KDE has a neat HDR calibration tool in the monitor settings, where you set your max luminance and paper-white value. The usual stuff.

After setting those values up, I set the same inside RetroArch on the HDR menu, and then I set them again on the Megatron parameters (so, I set them in 3 places in total).
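(To be concrete, the three places are KDE’s HDR calibration in the monitor settings, RetroArch’s own HDR settings, and the Megatron shader parameters. On the RetroArch side that boils down to entries roughly like the ones below; the key names are from memory and the numbers are just placeholders:)

```
# retroarch.cfg: Settings > Video > HDR (key names from memory, values placeholders)
video_hdr_enable = "true"
video_hdr_max_nits = "1000.000000"
video_hdr_paper_white_nits = "200.000000"

# ...and again in the Megatron shader parameters saved into the .slangp
hcrt_max_nits = "1000.000000"
hcrt_paper_white_nits = "200.000000"
```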

But I noticed the colors are washed out and the brightest values seem to be clipping into each other. The overall picture looks very wrong.

Then I went again to the KDE monitor settings and set the HDR max luminance to its maximum allowed value (2000 nits), and now Megatron looks much, much nicer!
Perhaps double tone-mapping is happening? Once in the shader, and a second time in the KDE desktop? Has anyone else had this experience?

Still, it feels like I can’t reach an acceptable brightness level. My TV is an LG C1, which has about 720 nits maximum brightness on a 10% window. Am I doing something wrong while configuring this shader, or is this to be expected on this TV? Help is appreciated! :slight_smile:

(I’m using the crt-sony-megatron-aeg-CTV-4800-VT-hdr.slangp preset with everything at defaults except the brightness settings.)


Sharing photos of the display showing the issue really helps much more than text descriptions of a visual problem.

The first person I heard from who tried HDR in Linux ended up discovering some bugs in the Linux HDR implementation. Make sure you’re seeing the changes in real-time when you adjust any of these HDR settings.

It’s possible that the default presets you’re trying could look washed out on your display. If all else fails, you can try increasing the Saturation after you get your brightness looking alright.

One thing to remember is that those Peak HDR Brightness measurements for the TV all use the white subpixel. RGB-based presets in Sony Megatron don’t use the white subpixel at all, so your effective Peak Luminance might actually be a little lower.

For WOLED displays, use the RWBG/WOLED Display’s Subpixel Layout, set Colour Accurate/Mask Accurate to Mask Accurate, and you can try Vivid Mode in the shader.

Don’t be surprised if you have to set your Paper White Luminance almost or just as high as your Peak Luminance to get acceptable brightness. Also don’t be surprised if you have to increase your Saturation a bit before your colours start to look accurate, in particular Red.

Different Phosphor Types also have vastly different Saturation levels.
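Putting that together, a rough starting point for a ~700 nit WOLED like a C1, once saved into a Megatron preset, might look something like this (parameter names from memory and purely illustrative; treat the Shader Parameters menu as the source of truth):

```
# Illustrative starting values for a ~700 nit WOLED such as an LG C1.
# Names are approximate; set these via the Shader Parameters menu rather than by hand.
hcrt_max_nits = "700.000000"          # at or a touch below the panel's measured peak
hcrt_paper_white_nits = "650.000000"  # on WOLED this often ends up close to peak
# Plus: Display's Subpixel Layout = RWBG/WOLED, Colour Accurate/Mask Accurate = Mask Accurate,
# and optionally Vivid Mode, as described above.
```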

Feel free to try my Sony Megatron preset packs. You might actually have an easier time getting them to look right on your OLED display, as most of them were made using an LG OLED display.

Yes! This sounds like a very plausible explanation. Indeed, the white subpixel is by far the most powerful on these types of WOLED displays. Seeing as Megatron only uses the RGB ones, naturally I would need to set the peak luminance value quite a bit lower than the TV’s maximum.

Indeed, at lower brightness levels the colors look totally fine in the shader, and it’s only when I try to compensate for the low brightness (by raising the paper-white luminance) that the colors start looking wrong. So I might be hitting my TV’s RGB subpixels’ peak brightness without reaching a satisfactory brightness level in the shader for normal usage.

In this case I might need to use a shader with mask compensation features, like Guest’s, and use HDR to boost overall brightness.

Yes, I am using the RWBG subpixel layout.
I will try the other things you suggested and see how close I can get to an acceptable picture. Thank you!

Ah yes! Good idea! I will try your Megatron presets and see what I can get out of those :slight_smile:
I will report back.


When I used to sit far from my OLED TV, I used to use 630/630 for Peak/Paper White Luminance.

When I sat closer, I started using 630/450.

I’ve recently observed that decreasing Viewport Size to 6X provides a nice increase in brightness over 8X and higher.

All else being equal, Shadow Mask and Aperture Grille presets tend to be brighter than Slot Mask Presets.

You should be able to get good enough brightness from an LG C1.

For a hybrid approach though, you can check out my CyberLab Mega Bezel HDR Ready presets.