Sony Megatron Colour Video Monitor

Another thing that can help in some instances (though it doesn’t sound likely in your case) is changing a setting in the Nvidia Control Panel under Manage 3D Settings, if you’re using one of their GPUs:

Vulkan/OpenGL present method - changing it to “Prefer layered on DXGI swapchain” helped with my own particular issues using HDR with PCSX2.


AzMods20250531, incorporating changes from crt-guest-advanced-2025-05-31-release1, and improvements for the “IRE 7.5” presets.

edit: AzMods20250601, incorporating changes from crt-guest-advanced-2025-06-01-release1.

Readme with additional details

Built on top of:

  • Image Adjustment (2024.02.07) by hunterk

  • crt-guest-advanced-2025-05-31-release1 by guest.r and Dr. Venom, based in part on Themaister’s NTSC shader

  • Sony Megatron Colour Video Monitor (2023.10.09) by MajorPainTheCactus

  • with additional functions based on or inspired by Grade by Dogway

All included “CRT Megatron” presets are currently tuned for the LG C1 and other similar displays (WRGB subpixel layout, with approximately 800 nits Peak Luminance).

The included version of Image Adjustment has been modified to allow for finer control of the effects as i found appropriate.

crt-guest-advanced-ntsc has been modified to add or expand the following options:

GPGX MS color fix

Corrects Genesis Plus GX’s Master System color output, fixing minor errors i discovered while implementing the Sega MS Nonlinear Blue Fix.

  • 0=off
  • 1=on (color saturation scaled to a maximum value of RGB 255)
  • 2=sat239 (scaled to a maximum value of RGB 239)
  • 3=sat210 (scaled to a maximum value of RGB 210)
  • 4=sat165 (scaled to a maximum value of RGB 165)
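If it helps to visualize what those saturation options describe, here is a rough Python sketch. This is my own guess at the intent (a simple linear rescale so full intensity maps to the listed maximum); the shader’s actual color-fix math may well differ.

```python
def scale_max(rgb, new_max):
    """Linearly rescale 8-bit RGB so full intensity maps to new_max.

    Hypothetical illustration of the sat239/sat210/sat165 options --
    not the shader's actual implementation.
    """
    return tuple(round(c * new_max / 255) for c in rgb)

# Full-intensity white under option 2 (sat239):
print(scale_max((255, 255, 255), 239))  # -> (239, 239, 239)
```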

Sega MS Nonlinear Blue Fix

An implementation of the behavior described in Notes & Measures: Nonlinear Blue on Sega Master System 1 & Other Findings by bfbiii.

This setting automatically adjusts to work with the GPGX MS color fix settings.

Sega MD RGB Palette

An implementation/approximation of the Mega Drive/Genesis RGB palette as discussed here.

Downsample Pseudo Hi-Res

As i understand it, 15KHz CRT displays would treat double-horizontal resolution modes (512x224, 640x240, etc.) as though they were not doubled, resulting in a blending effect called pseudo hi-res. A number of SFC/SNES games are known to have used this behavior for transparency effects, including Breath of Fire II, Jurassic Park, and Kirby’s Dream Land 3, and as far as i know it is the correct behavior for any device originally meant to be displayed on a 15KHz CRT TV/monitor.

  • 1 = off

  • 2 = Triggers the blending effect whenever the horizontal resolution is more than twice the vertical resolution. This works well with cores that either always output a pseudo hi-res image for compatibility (such as bsnes-jg), or cores that only use pseudo hi-res for pseudo hi-res content (such as SwanStation). True high-resolution/interlaced content is not affected.

  • 3 = Triggers the blending effect whenever the horizontal resolution is 480 or higher. This is needed for cores that display pseudo hi-res content in a true high-resolution container (such as Mesen-S and a number of bsnes variants). Unfortunately, this halves the resolution of true high-resolution/interlaced content, as there is no way to differentiate pseudo hi-res and true high-resolution/interlaced content in these cores.
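The trigger conditions above can be sketched in a few lines of Python. This is just an illustration of the option descriptions (the function and pixel-blend logic here are hypothetical, not the shader’s code):

```python
def pseudo_hires_blend(mode, width, height):
    """Return True when the blending effect should trigger.

    Hypothetical restatement of the option descriptions:
    mode 2 keys off the aspect of the framebuffer,
    mode 3 keys off a fixed horizontal threshold.
    """
    if mode == 2:
        return width > 2 * height
    if mode == 3:
        return width >= 480
    return False  # mode 1 = off

def blend_pairs(row):
    """Average adjacent horizontal pixel pairs, halving effective resolution."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row) - 1, 2)]

print(pseudo_hires_blend(2, 512, 224))  # -> True  (pseudo hi-res SNES mode)
print(pseudo_hires_blend(2, 640, 480))  # -> False (true hi-res left alone)
```

Note how mode 2 leaves 640x480 untouched while mode 3 would blend it, which is exactly the trade-off described for cores that use a true high-resolution container.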

Internal Resolution

Modified to allow up to 1/16th downsampling. (It’s a surprise tool that will help us later.)

Sony Megatron has been modified to add or expand the following options:

HDR: Content Color Gamut

Out of the box, RetroArch and Megatron either clamp colors to the Rec. 709 gamut (Expand Gamut set to Off in RetroArch, or HDR: Original/Vivid set to Original in Megatron), or stretch that Rec. 709 gamut to an unusual non-standard gamut created by someone at Microsoft (?Chuck Walbourn?) called Expanded 709 (Expand Gamut set to On in RetroArch, or HDR: Original/Vivid set to Vivid in Megatron).

Obviously, this is undesirable, as all of the major “retro” color gamuts include colors that fall outside of Rec. 709.

Serendipitously, i found that it was possible to turn this problem into its own solution by simply adding additional color gamuts to the “HDR: Original/Vivid” setting, renaming it “HDR: Content Color Gamut” to better reflect its newfound purpose.

When using this setting, Colour System should be set to 0/r709, and Phosphors should be set to 0/NONE.

Options are as follows:

  • 0=Rec 709/sRGB (SDR HDTV/Windows gamut)
  • 1=Expanded 709
  • 2=NTSC 1953 (The OG color system that was only really used for like 5-8ish years back when basically no one owned a color TV anyway. If you are Brazilian or from a SECAM region, it may also match some old CRT TVs you’ve used with really weirdly intense greens? Hard to say. This sort of thing is kind of underdocumented.)
  • 3=RCA 1958 (?1961?) (Millennial’s grandparent’s old TV with weird colors #1.)
  • 4=RCA 1964 (Millennial’s grandparent’s old TV with weird colors #2.)
  • 5=SMPTE C/Rec 601-525 line/Conrac (Baseline standard gamut for Analog NTSC.)
  • 6=PAL/Rec 601-625 line (Baseline standard gamut for Analog PAL.)
  • 7=Dogway’s NTSC-J (Baseline standard gamut for Analog NTSC-J.)
  • 8=P22_80s (Dogway’s Grade gamut for 1980s-early 1990s TVs.)
  • 9=Apple RGB/Trinitron PC (Should approximate basically any Trinitron monitor from 1987 through the mid-to-late 1990s. By the early 00s, they were SMPTE C instead, at least for high-end monitors like the FW900.)
  • 10=guest’s Philips PC (Gamut used by a number of extremely popular monitors that used Philips tubes, including Philips CM8533, Philips VS-0080, and Commodore 1084)
  • 11=P22_90s (Dogway’s Grade gamut for mid 1990s TVs with tinted phosphors.)
  • 12=RPTV_95s (Dogway’s Grade gamut for late 90s/early 00s rear projection TVs that game manuals said you shouldn’t play games on due to burn in risk.)
  • 13=Display P3/P3-D65 (Common wide color gamut. Variant on the gamut used for film with shared primaries. Might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader?)
  • 14=Rec 2020 (HDR gamut. Again, might be useful in the future if someone makes a WCG pixel game that looks best with a CRT shader.)

Gamut Overshoot Fix

A fix MajorPainTheCactus came up with to deal with the color errors i noticed using lilium’s HDR analysis shader. (Sony Megatron Colour Video Monitor)

He decided not to implement it at the time, as he didn’t think it would make a perceptible difference, but a friend and i both came to the conclusion that it makes certain test pattern colors look more like they do with no shaders applied, so i have continued to use it. There should be no downside. (Sony Megatron Colour Video Monitor)

  • 0=no fix
  • 1=the 7-decimal clamp
  • 2=the 6-decimal clamp (results in non-zero blacks, perceptible in a sufficiently dark room)
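i don’t know the exact math of the clamp, but as a loose guess at what a “decimal clamp” could look like (purely illustrative, not the shader’s actual code), pulling values away from the exact [0, 1] bounds by 10^-n would explain why the 6-decimal variant produces non-zero blacks:

```python
def overshoot_clamp(v, decimals):
    """Hypothetical sketch of an n-decimal clamp on a normalized channel.

    Keeps v inside [10**-decimals, 1 - 10**-decimals]; with decimals=6
    the floor is 1e-6, so black is no longer exactly zero.
    """
    eps = 10.0 ** -decimals
    return min(max(v, eps), 1.0 - eps)
```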

Scanline Multiplier/Divisor

Multiplies or divides the number of scanlines.

Useful for cases like DOS games meant for 31KHz displays, which output 320x200 that was then double scanned to 640x400.

The divisor options are handy for displaying increased internal resolution output from 3D cores that don’t include a good downsample option, such as PPSSPP. I strongly recommend using this setting in conjunction with crt-guest-advanced-ntsc’s Internal Resolution setting to reduce shimmering.

  • -1,0,1=1x (Default/off)
  • 2=2x (Doublescan)
  • 3=Auto 2x (Automatically doublescans any content with a vertical resolution of less than 300, while leaving higher resolution content unchanged)
  • -2=1/2 (Reduces the scanline count to one half default)
  • -3=1/3 (Reduces the scanline count to one third default)
  • -4=1/4 (Reduces the scanline count to one quarter default)
  • -5=1/5 (etc.)
  • -6=1/6
  • -7=1/7
  • -8=1/8
  • -9=1/9
  • -10=1/10
  • -11=1/11
  • -12=1/12
  • -13=1/13
  • -14=1/14
  • -15=1/15
  • -16=1/16
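The mapping from setting value to scale factor can be summarized like so (my own restatement of the option list above; the function name and structure are hypothetical):

```python
def scanline_factor(setting, vertical_res):
    """Map the Scanline Multiplier/Divisor setting to a scale factor
    applied to the scanline count. Hypothetical sketch of the option list."""
    if setting == 3:                 # Auto 2x: doublescan only low-res content
        return 2 if vertical_res < 300 else 1
    if setting == 2:                 # Doublescan
        return 2
    if setting <= -2:                # Divisors: -2 -> 1/2, ..., -16 -> 1/16
        return 1 / -setting
    return 1                         # -1, 0, 1 = default/off

print(scanline_factor(3, 240))  # -> 2 (240p content gets doublescanned)
print(scanline_factor(3, 480))  # -> 1 (higher-res content left unchanged)
```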

Got a weird and maybe stupid question which I figured I’d throw out here since this shader has been ported to Reshade and can be pretty easily used independently of render resolution.

I saw a comment from someone elsewhere which mentioned that on PCSX2, running a game at 720p actually more closely resembled how 480i looked on native hardware on their CRT, than running it at 480p did. Is there anyone who’d be open to testing this claim or has the experience to counter it outright?

Sadly I don’t have a CRT or PS2 to side-by-side with myself, but I found it an interesting claim, since if that IS the case, running in software mode would technically not be the most visually accurate 1:1 representation of how the final image would be presented. Does 720p with this shader look more like 480i does on an actual CRT of a similar look, or does 480p provide an equal image on both?

Technically only doing half the work, but I have two pictures, one at 720p, one at 480p, both emulated.

https://imgur.com/a/GKHmLpo

I also realised 480p can introduce UI issues that disappear at 720p, like this representation of the circle button, but I don’t know if the circle is broken like that when displayed on an actual PS2 or not.

https://imgur.com/a/MEtPV0j

Purely visually speaking, does the emulated 480p actually look more accurate to how it should, or does it only match up on paper by numbers alone? As I understand it, emulating 480i accurately is tricky to match up to how it’s handled by proper hardware of the time, so is 720p actually a more accurate workaround to matching that look?

what exactly are we looking for?

Is that in reply to me?

I’m asking: if you try to replicate how 480i looks on a PS2 with a CRT as closely as possible, does running an emulated game with a 480p shader look more accurate when the internal res is 480p, or 720p? Off number-matching alone I’d have thought it should be the 480p one, but I’m curious since that comment I read said 720p looked closer. I just figured a sample size of ‘1’ isn’t exactly big, so I was curious to see what other people would make of it who have the means to do the same test. Say you run this shader through ReShade and swap the game’s resolution between 480p and 720p on an OLED or whatever, with an actual PS2 and CRT running the same game at native res next to it. Which looks like a closer match to the real PS2 on a real CRT: the 480p emulation, or the 720p one?

Someone who made progressive patches recommended increasing internal resolution for games that absolutely need to be deinterlaced, so maybe this is in reference to that. However, this would also hint that it’s very game-dependent. For many games, deinterlacing isn’t relevant; they can render fine in progressive if you force them into that on real hardware. Hard to see how these games won’t look rather inaccurate if you double the native resolution (in practice the “720p” setting should result in 896 lines internally most of the time; on PAL, optimized games typically use 512 lines natively, so doubling already brings you close to 1080p…)

I could test some games on real hardware and a CRT TV versus raw PCSX2 output (my monitor isn’t really suited for slot mask TV mimicking attempts).

I’d be very appreciative if you ever decide to do that. As someone who’s gone back and forth between 480p and 720p in PCSX2, without that native stuff for comparison, I feel like I can see arguments for both being more accurate, but obviously it’s all assumptions I’m making by feel.

On the one hand, 720p sometimes looks maybe a little too pristine, to where I’m like ‘Surely this game at 480i/p on native hardware doesn’t look this clean?’, but at others, when running at 480p, some games make me think ‘This image looks too rough in a way that doesn’t feel right’. Some games look better than others; I’ve played some fighting games at 480p and thought ‘This seems fine’, but then I play a game like Burnout 3 and it can be kinda hard to see oncoming traffic and stuff properly because everything’s so low-res looking.

Obviously it varies from display to display and signal to signal, but I know a lot of people say a CRT aids a lot of 480 games quite noticeably with an almost natural, built-in anti-aliasing, which I don’t feel like shaders particularly provide much of at that resolution. To text and stuff, yes, but not really the 3D assets. But if you boost the resolution a bit, keeping the shader to how it should look on 480p content, obviously brute-forcing more pixels means less aliasing, hence I can see how, in theory, that might more closely mirror how the proper hardware looks, even if the resolutions between the two aren’t the same.

I wouldn’t even NEED it to be a slot mask you use with a shader, the mask won’t really affect that, so if an aperture grille or something’s more convenient then no worries. All the test would be is having that set up and going back and forth between 480 and 720, then looking over at the real deal and seeing which is a nearer match.

Also damn it I keep forgetting to post these as replies and having to delete my posts.

You do know that CRTs and their screens and masks came in all different shapes and sizes, right? Both could be just as accurate, because CRTs varied quite a bit.

Oh yeah, that’s why I said ‘Obviously it varies from display to display and signal to signal’; I meant that in the context of CRTs. I just wondered what’d be more representative of an average experience on a general consumer CRT setup; I’m not looking for a PVM experience or anything. As far as just a standard, middle-of-the-road setup, I figured the presentation would be somewhat comparable between sets once you’re actually sat back at a normal distance. Sure, TVL and mask type and all that will influence things to some degree, but would it affect things to the point of the game’s resolution itself varying that dramatically? That’s what I wondered: whether some CRTs at 480i are giving a comparable image to 720p on an emulator. You say both could be just as accurate, so I’m assuming the answer is ‘yes’.

A CRT isn’t magically going to make the 512x448 or 640x448 output of the PS2 look like the 1024x896 or 1280x896 that PCSX2 is outputting at “720p” internal resolution.

The amount of aliasing may be more perceptually similar in some ways, i guess, depending on the CRT TV? But PCSX2 is going to be putting out significantly greater detail at those settings, no ifs, ands, or buts. We are talking double the resolution here, so 4x the pixel density.
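To put numbers on that:

```python
# PCSX2's "720p" internal resolution doubles each axis of the PS2's
# native 512x448 output, quadrupling the total pixel count.
native = 512 * 448            # 229,376 pixels
upscaled = 1024 * 896         # 917,504 pixels
print(upscaled // native)     # -> 4
```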


After doing some tests and comparing, I could see why someone would consider a 2x resolution, blown-up PCSX2 picture closer to a CRT TV versus native. But once CRT shaders come into play, this should be negated, at least for the majority of graphics. Maybe it’s useful for the mentioned UI issues, rendering fonts better if you look closely, things like that.


Ah, thank you very much for giving it a go. You have my gratitude, and you’ve put my curiosities to rest. In-person really is the only way to tell with stuff like this, since pictures from someone’s phone (which is all I have access to for comparison) never paint the full picture.