Sony Megatron Colour Video Monitor

Hello,

Sorry if this problem has already been mentioned, but I’m trying to use your shaders with my new Dell G3223D monitor and the latest stable version of RetroArch. The problem is that I get totally washed-out colors with all of your HDR shaders. Is this a known problem, and if so, do you have any idea of its origin?

Thanks for your help.

1 Like

Hi yoZe, that’s probably because you haven’t turned on HDR in Windows and/or RetroArch (Settings->Video->HDR). Also try the SDR version to see if that makes things better for you.

2 Likes

@rafan I tried moving the color space conversion to before the scanline and mask generation and have had some success. There’s a problem with HDR as the values are out of whack but I’m hoping I can fix this. This’ll definitely fix the issues you’re seeing.

4 Likes

Sony Megatron V4.0

Just submitted the fix for the primary colour transforms breaking the mask and muddying the colours in HDR and DCI-P3 modes.

Hopefully this addresses @rafan’s issues, and also @cyber’s and @Wilch’s issues on OLEDs, although this would have affected all TVs; maybe it was just more apparent on OLEDs.

I think this has brightened the image somewhat, as whites are possibly a bit whiter, but I’m not sure.

Anyway let me know what you think.

Thanks to @rafan (and hopefully @cyber) for their attention to detail on this, as it’s really helped and is greatly appreciated.

Anyway before I jump the gun let’s see if it fixes a few things!

LCD Photos: OnePlus 8 Pro Camera: Pro Mode, ISO 200, WB 6500K, Shutter Speed 1/60, Auto Focus, 48MPixel JPEG.

5 Likes

Excellent!

Looking forward to testing this!

You’re most welcome!

Wow! Those “phosphors” look like phosphors!

2 Likes

CRT-Sony-Megatron-Sony-PVM-2730-HDR

HDR On in RetroArch/Windows, HDR On in Preset

RWGB (OLED) Subpixel Layout, 300TVL, NTSC-U

Note: the dark grey horizontal line is from the shader parameters menu.

RWGB OLED Photos: Samsung Galaxy A71 Camera: Pro Mode, ISO 100, WB 5000K, Shutter Speed 1/60, Manual Focus.

Oh yeah, it’s fixed alright!

Next stop: verifying whether the Slot Mask presets work well on OLED and, if not, designing one that does.

2 Likes

Great stuff! Thanks @cyber for confirming this. The trick was to separate the rec.2020 primaries conversion from the ST.2084 PQ conversion, doing the former before the scanline and mask simulation and the latter (the PQ encode) after it.
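
For anyone following along, here’s a rough Python sketch of that ordering - purely an illustration of the maths rather than the actual slang/HLSL shader code. The matrix is the standard BT.2087 one, and names like apply_mask_and_scanlines and peak_nits are made up to stand in for the real shader parameters:

import numpy as np

# BT.709 -> BT.2020 primaries conversion (ITU-R BT.2087)
REC709_TO_REC2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def pq_encode(nits):
    # SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m^2 -> 0..1 signal
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def hdr_pipeline(linear_709_rgb, apply_mask_and_scanlines, peak_nits=700.0):
    # 1) primaries conversion first, while the colour is still a plain RGB triple
    linear_2020 = REC709_TO_REC2020 @ linear_709_rgb
    # 2) scanline / mask simulation on the wide-gamut linear value
    masked = apply_mask_and_scanlines(linear_2020)
    # 3) ST 2084 PQ encode last, scaled to the display's peak luminance
    return pq_encode(masked * peak_nits)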

2 Likes

@Cyber slightly unrelated question: what game are your photos from? I kind of recognise that font.

1 Like

Oh, that’s F-Zero, SNES.

1 Like

So after some side-by-side comparisons I think I’m going to add an option to apply the HDR primaries conversion either before or after the mask, as there is a clear trade-off between the two in terms of image quality. Moving the HDR primaries before the mask and scanlines gives a crisper, more accurate mask, whereas moving them after gives more accurate colours and a slightly softer look.

The reason for this is that in HDR, to move (for example) green back to the hue it has in SDR, you have to add red to it, i.e. make it slightly yellower; that’s the rec.709 green we all know and love, compared to the greener green of rec.2020.

This creates a problem for our mask, as the mask breaks this red out into its own virtual phosphor and has the effect of over-emphasising it, giving all the greens a slightly yellower look than if you do the conversion after the mask has been applied. You can see this clearly on the opening map screen of Super Mario World on SNES, but it’s there everywhere when you look.
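
To put some numbers on that, here’s a tiny Python sketch using the standard BT.2087 rec.709-to-rec.2020 matrix (nothing Megatron-specific) showing how much red a pure rec.709 green picks up once it’s expressed in rec.2020 - that red component is exactly what the mask breaks out into its own subpixel:

import numpy as np

# Standard BT.709 -> BT.2020 primaries conversion (ITU-R BT.2087)
REC709_TO_REC2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

pure_709_green = np.array([0.0, 1.0, 0.0])
print(REC709_TO_REC2020 @ pure_709_green)
# -> [0.3293 0.9195 0.088]: roughly a third of a unit of red is needed to pull
#    the wider rec.2020 green back to the yellower rec.709 hue, and the mask
#    renders that red as its own lit "phosphor".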

You can mitigate this effect somewhat by changing the colour temp to a higher temperature, resulting in a cooler/bluer image. I’ve found around the 3000 Kelvin mark seems to be a good spot, but you could probably go higher. However, this then gets away from the accuracy of the colour system: NTSC-U/PAL is supposed to use D65 (6500K) whereas NTSC-J is supposed to use D93 (9300K), and we’re effectively throwing that all away.

Anyway, long and short, I’m not sure we’re going to get around the fact that we have to do this colour conversion for HDR to make it look right, and that just doesn’t play nicely with masks. Possibly the best we can hope for at this point is really bright SDR displays, but they don’t look like they will ever get as bright as their HDR counterparts.

Mind you, having said all that, it’s quite difficult to tell the difference unless you know what you’re looking for. I personally prefer the original, more accurate colour with the ever-so-slightly softer mask, but that will be display, mask and user dependent!

4 Likes

Yeah, I was thinking about this, as well. We’re ultimately constrained by the color of the LCD subpixels themselves, since they’re immutable. The only way to change a perceived color from the subpixels’ own color is to add another subpixel to it at some level of brightness, thereby polluting our pure, perfect mask.

3 Likes

Just to add that although the green axis is the worst (as it has the largest shift in rec.2020), the colour temp trick won’t work for the same problems found on the red and blue axes, so I’m not sure how much runway that trick really has.

1 Like

Hi and thanks for your answer. Apart from the shader menu, I don’t see any HDR menu in RetroArch. I tried enabling HDR through the Windows and Nvidia menus but without success. As for the SDR shaders, they are much too dark on my screen to really enjoy them.

1 Like

If colourspace transformations are affecting the mask to the point of creating additional subpixels then how is this so much different from other shaders which deliberately affect the mask using effects like Bloom for example?

If the goal is to achieve as accurate a solution as possible, I think the mask needs to be static/untouched, and whatever accommodations are required to get accurate colour from an output and visual perspective should probably be worked on and applied to the “video signal”/“electron beam”, even if the current theoretical methods of achieving that might be inaccurate.

It’s good to have the option to have things either way though.

2 Likes

Hmm, so it sounds like you’re possibly not using an up-to-date version of RetroArch - I’d recommend 1.10.0 at least (I think that’s right). If everything is working correctly and you’ve chosen the Vulkan, D3D11 or D3D12 driver, then you should see it under Settings->Video->HDR (see the very first post in the thread for instructions).

Yes! Fixes all the issues I was seeing, thank you! :smiley:

Just awesome :star_struck:

Yeah, so when a gamut conversion is done, the primary colors can only be simulated by this break-out (which makes sense). I noticed it myself too when doing a test case with red: an additional green virtual phosphor is lit on a pure red screen, to make the DCI/2020 red softer (like 709 red).

In itself there’s not much wrong with the break-out into virtual phosphors, imo. The same would happen if you displayed a green that is (relatively) more yellowish than the CRT’s native primary green on a real CRT, i.e. the virtual phosphor mask behaves as it should.

I can understand it becomes an issue when, as in your case, you’re seeing greens (with the broken-out virtual phosphors) with a slightly different hue than before, or than what you’re used to.

However a few thoughts come to mind:

  • What is the correct green? :slight_smile: Ideally you would verify with a CRT that you know for certain is calibrated to exactly the 709 colour gamut (phosphor primaries exactly matching what the matrix in the shader is based on) AND whose gamma is calibrated the same. That would eliminate the subjective factor and stick with the real CRT green hue.
  • I noticed that the colour gamut conversion, and the hue impression at the output, are quite sensitive to gamma changes. So you may want to vary gamma in and gamma out a bit to see what that does to the hue.

In that regard there’s a more general thing on my mind with the gamma curves. Some thoughts below; it would be interesting to hear your view on the matter.

For the CRT era, one could discuss what gamma curve/function would best describe a CRT. This could be any of

  • pure power law function
  • 709
  • sRGB
  • Some totally whacked function, because back in the day we would turn the “Contrast” knob on our TVs and monitors probably a bit higher than the studio-recommended “100 nits” (which is quite dim if you calibrate a CRT to that), and we would turn the “Brightness” up so we could see our screens in daylight, totally whacking the “studio recommended” 2.2-gamma black level meant for a dark room. I’m sure these analogue Contrast and Brightness settings did not amount to a clean change of a single “gamma exponent” value, but actually changed the shape of the curve by quite a bit.

From my understanding, sRGB-calibrated curves are a thing from 1996 CRTs onwards. So what would best describe 80s CRTs? Pure power-law gamma?
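
To make the difference concrete, here’s a quick Python sketch of the three candidate decode curves (just the textbook formulas, not anything taken from the shader) - they agree reasonably well in the mid-tones but diverge strongly near black, which is where the choice matters:

def srgb_decode(v):
    # IEC 61966-2-1 piecewise curve: linear toe below 0.04045, 2.4 power above
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def bt709_decode(v):
    # Inverse of the BT.709 camera OETF: linear segment below 0.081
    return v / 4.5 if v < 0.081 else ((v + 0.099) / 1.099) ** (1 / 0.45)

def power_decode(v, gamma=2.4):
    # Pure power law, roughly what BT.1886 assumes for a CRT display
    return v ** gamma

for v in (0.02, 0.1, 0.5):
    print(v, srgb_decode(v), bt709_decode(v), power_decode(v))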

Then on the output (display) side we also have a multitude of possible functions, especially with the addition of HDR.

The issue at hand is what best describes the CRT we’re simulating (which gamma curve and values) and what the display device we’re using is (which curve and values): an SDR LED panel? An HDR one? Chances are the curve of the simulated CRT and the output curve (your monitor) will differ, and ideally we would compensate for that difference.

Since we don’t know for sure what curve (power law, 709 or sRGB) best describes the CRT we’re simulating, especially with regard to the wild west of the 80s, I’m thinking it could make sense for the shader to make these gamma functions selectable separately from the colour gamut.

In the current shader setup, when you choose the 709 primaries colour gamut (which -could- be correct for an 80s CRT), the 709 gamma curve is stitched to it. Same for sRGB. But maybe a pure power law would better describe an 80s CRT?

Maybe ideally you could select the colour gamut and the gamma function separately from each other, as this freedom would allow a possibly closer/better simulation of CRTs from all eras (80s through to 90s).

My thoughts so far bring me to believe that it would be great to have the gamma curves in the shader, which are now stitched to specific colour spaces, separated out, just so that, for example, we could accommodate the above case.

Ideally on the output side we could do the same. Currently power-law gamma is stitched to the DCI space, while it could make a lot of sense to have an sRGB curve in this case, as many current monitors have a “target” of the DCI colour gamut (e.g. 90% coverage or whatnot), but their gamma is actually better described by sRGB than by a power law.

So the short story is whether it could make sense to make the gamma-in (CRT) curves and the output curves separate parameters, so that colour spaces and gamma functions can be mixed and matched, giving more flexibility to accurately match/simulate the wide range of CRTs out there on the wide range of (O)LEDs out there.

So for both the “Your Display” section and the “CRT” section a “Gamma Type” could be introduced, with the various gamma functions as a parameter broken out from the colour spaces (but with the same names, for easy identification that “by default” they should be matched up):

So for the “YOUR DISPLAY’S SETTINGS:” it could be something like:

SDR: Display's Colour Space: r709 | sRGB | DCI-P3
SDR: Gamma Type: r709 | sRGB | DCI-P3
SDR: Gamma Value: (1.0 - 5.0)

and on the “CRT SETTINGS:” it could be something like

Colour System: r709 | PAL | NTSC-U | NTSC-J
Colour System Gamma Type: r709 | sRGB | Power Law
Colour System Gamma Value: (1.0 - 5.0)
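
To illustrate the kind of decoupling I mean, a rough Python sketch (the parameter names are made up purely for illustration - this isn’t how the shader is currently structured):

import numpy as np

def decode(v, gamma_type, gamma_value=2.4):
    # Signal -> linear light, with the curve chosen independently of the gamut
    v = np.asarray(v, dtype=float)
    if gamma_type == "sRGB":
        return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)
    if gamma_type == "r709":
        return np.where(v < 0.081, v / 4.5, ((v + 0.099) / 1.099) ** (1 / 0.45))
    if gamma_type == "power":
        return v ** gamma_value
    raise ValueError(gamma_type)

# "Colour System" (primaries matrix) and "Colour System Gamma Type" picked
# separately: e.g. rec.709-ish phosphor primaries with a pure power-law curve
# for an 80s set, or the same primaries with the sRGB curve for a late-90s one.
signal = np.array([0.25, 0.5, 0.75])
linear_80s_crt = decode(signal, "power", 2.4)
linear_90s_crt = decode(signal, "sRGB")
# ...then apply whichever primaries matrix the chosen Colour System dictates.
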
1 Like

So you’re absolutely right in saying that switching on these extra subpixels is wrong when strictly considering a CRT screen. What we’ve highlighted here is that in order to use HDR, and therefore the rec.2020 colour primaries, we have to rotate pure green towards red for it to look right, and so break this ideal.

The only real thing I can say is that blooms and blurs turn on subpixels to simulate luminance (which our eyes are far more sensitive to), whereas in this case we’re turning on subpixels to simulate chromaticity, which our eyes are less sensitive to (this is how JPEG works), i.e. you could argue it’s much more noticeable/inaccurate to turn on those subpixels for blooms and blurs than to turn them on to get the right colour in HDR.

I’ll let you decide how much water you think that argument holds but you can definitely see the difference between this shader and most (all?) of the presets in the presets folder for instance.

Basically I just have to be pragmatic about this and accept that there doesn’t currently appear to be a way to get HDR levels of luminance without shifts in chromaticity.

4 Likes

So yeah, that suggestion makes sense; the only thing I’d say is that maybe it over-complicates things? Mind you, we’re over-complicated as it is, so what’s a bit more complication. Let me have a think about it; I need to streamline the shader a bit now anyway, as it’s getting a bit gnarly in my opinion.

With regard to what the old colours looked like, sure, it may be a bit subjective, but I do have two Sony 2730 PVMs, so I’ll do a side-by-side and check; I’m pretty sure their greens aren’t as yellow. I’ll post some pictures for you.

2 Likes

Just added another pull request, for Sony Megatron V4.1, which adds a switch to toggle between what I’m currently thinking is more colour accurate (soon to be proved/disproved) and what is mask accurate.

This lets you easily see the differences, if nothing else. I’ve defaulted it to the shader’s old behaviour (i.e. colour accuracy) for now, but it’s totally up in the air what the default should be and whether it really is ‘colour accurate’.

4 Likes

Hi @MajorPainTheCactus

I’ve been testing HDR mode a bit again and I don’t seem to be able to get as good/accurate a Gray Ramp in HDR mode compared to SDR mode. Whereas in SDR mode I can tweak the Gray Ramp pretty much to perfection, in HDR something always seems off: either the whites remain blown out, or I can balance the whites but then the whole screen is too dark, etc. I’ve tried everything in the book (max nits, paper white, gamma, contrast).
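
For context, this is the kind of mapping I have in mind when fiddling with those values - the formula is the textbook ST 2084 PQ encode, and the 200-nit paper white is just a guess on my part, not the shader’s default:

def pq_encode(nits):
    # SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m^2 -> 0..1 signal
    y = min(max(nits / 10000.0, 0.0), 1.0)
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

paper_white = 200.0  # nits assigned to SDR reference white (illustrative value)
for step in (0.25, 0.5, 0.75, 1.0):  # grey-ramp steps in linear light
    print(step, round(pq_encode(step * paper_white), 3))
# Because PQ is absolute, every grey step moves whenever paper white, max nits
# or gamma changes, which is why balancing the ramp in HDR feels so much
# touchier than in SDR, where white is simply "1.0".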

Just to verify with you: are you able to get as well-balanced a Gray Ramp in the 240p test suite in HDR mode as in SDR mode?

2 Likes