Sony Megatron Colour Video Monitor

Yes, the blues are off, which is where I'm hoping the white balance will come in to help (so fear not! :grin:).

I think the shape of the scanlines is a better match though, and if worst comes to worst, after I play about with the white balance and it doesn't help to the degree I'm wanting, I'll add channel-specific curves.

3 Likes

Definitely try this latest shader as it uses an RGBX mask. It might help matters, but I think @Nesguy is right and it's going to be difficult to achieve the same quality level with W-OLEDs.

If you can do me a favour when/if you try the new version: take a close-up image with the camera settings I've detailed in my posts and post it back here for me. Ideally of Link on SNES - it doesn't matter where he is.

I don't have access to a W-OLED, so this would help me greatly in seeing if there is anything I can do.

1 Like

Yes, I'm definitely thinking my DisplayHDR 600 display is now not bright enough due to the new RGBX mask, which obviously reduces brightness by 25% compared to a straight RGB mask.

This shader is all about the performance of the user's display and simulating accuracy, so I'm not going to use other masks if it moves the phosphor shape and layout away from my 600TVL CRT just to gain brightness (although the user can swap masks via the resolution pattern setting - I should change that name!).

I think a DisplayHDR 1000 monitor is going to be the base level for getting near PVM-level brightness.

10,000 nits is the top end of the PQ curve used by the Rec. 2020 HDR standard, so your brief 10,000 nits might be the reason 10,000 was chosen for it (I had wondered why). I believe LCDs work kind of like blinds on a super-bright window. I also believe LCDs are probably better for more accurately emulating CRTs at the moment because of their poor dark levels, i.e. light bleed. My PVMs light bleed all over the place, BUT that may well be because they are so damn bright - see white text on black backgrounds.

Having said all that, the above pictures of my shaders aren't really representative of the brightness in person. Here's a screenshot using normal Android photo mode (it's a bit over-brightened, but not by much).

5 Likes

Thank you for adding the HDR options into RetroArch, a game changer for the usability of CRT shaders :slight_smile:

Nice comparison shots also! Intriguing approach to CRT simulation…

For an apples-to-apples comparison it would be good to know the black level, brightness, gamma (curve) and contrast of your CRT (if you have a colorimeter…).

Is the CRT set to a “normal” brightness level (say 150 cd/m2) for a “normal” dimly lit room, or is the CRT brightness maxed out? You’ll make it harder on yourself in the comparison if the CRT brightness is maxed out.

Another thing that could have some impact is the gamma (curve) of the CRT versus the LED. A higher gamma on the LED (some are 2.4 out of the box) may set you back in your brightness comparison when the CRT is at 2.2 (or lower when brightness and contrast are set high).

In a similar way, a black level difference may cause unwanted apples-to-oranges effects. These are exposed quite quickly when watching the CRT and LED side by side in a dark room. Raising the black level on the CRT feels unnatural if your LED has a higher black level, but it will lead to a better comparison contrast-wise for dark scenes.

Anyway good going on the HDR work and the comparisons.

With regards to the HDR options in RA, I have some questions which it would be great if you could explain a bit.

What gamut is active when HDR is “on” in RA (and Windows)? Is it Rec.709 / sRGB?

And what is “expand gamut” actually doing? What gamut is active when this option is “on”? Is it Rec.2020 or my monitor’s native gamut? My monitor’s native gamut is wide color gamut, but smaller than Rec.2020 (as far as I know there are no 100% Rec.2020 volume monitors on the market).

So I’m confused about what color gamut I’m arriving at when setting “expand gamut”. Would be great if you could shed some light…

As a side note, maybe it’s interesting to some who don’t have access to a colorimeter: the below VESA tool reports the native gamut / color primaries and white point reported by the monitor.

3 Likes

Hi @rafan, thanks! So my CRT is a Sony PVM 2730QM and it doesn’t have an OSD which tells you an amount of brightness or colour or hue etc. What it does have is a ‘NORM’ button which sets the PVM to what the manual terms ‘standard’ settings, and this is what my PVM uses (I don’t have a colorimeter to do any fine tuning). This is definitely not the brightest the PVM can go and looks about in the middle of its range.

So although 150cd/m2 is a good measure of brightness over the whole screen, this is not what we’re interested in when using HDR for CRT simulation. What we’re trying to do with HDR is get the individual pixels of the LCD/OLED to match the luminance of the individual phosphors in a CRT (note it’s candelas per metre squared in 150cd/m2, so it’s not a very useful figure at the microscopic level). There are a number of factors that mean an LCD needs to be far brighter over the whole screen when trying to match the luminance of a CRT.

This includes the fact that when measuring luminance for an LCD, people will generally have pure white on all pixels, meaning every single sub pixel is at max brightness. When emulating a CRT on an LCD, only a fraction of those sub pixels will be at full brightness on a pure white screen. This is primarily because we emulate scanlines, so the center is at peak brightness and as we go out from that center point vertically it gets darker. Also, we use a mask to mimic the phosphor shape and layout, which will turn off 9 in 12 sub pixels horizontally.

So, as you can see, just those sub pixels that are on have to be a lot brighter to get your overall screen brightness to be similar to that of a CRT. A CRT will be A LOT brighter than 150cd/m2 at the center points of its scanlines, as it too has its phosphor mask (and scanlines) dulling down the overall screen brightness.
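To put rough numbers on that, here's a back-of-the-envelope sketch - the coverage figures are illustrative assumptions of mine, not the shader's actual maths:

```c
/* Rough illustration of why an LCD/OLED needs far more peak brightness
 * than its full-screen rating when emulating a CRT: scanlines and the
 * phosphor mask leave only a fraction of the sub pixels lit. */
#include <stdio.h>

int main(void)
{
    /* Assumed figures, for illustration only. */
    double target_full_screen = 150.0;      /* cd/m2 we want the emulated image to average */
    double mask_coverage      = 3.0 / 12.0; /* mask lights roughly 3 of 12 sub pixels (the "9 in 12 off" above) */
    double scanline_coverage  = 0.5;        /* assume the beam profile averages to ~50% vertically */

    /* Peak luminance the lit sub pixels need so the whole screen still
     * averages out to the target. */
    double required_peak = target_full_screen / (mask_coverage * scanline_coverage);

    printf("Required peak luminance: %.0f cd/m2\n", required_peak); /* ~1200 cd/m2 */
    return 0;
}
```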

With regards to gamma curves, remember we’re not using an SDR LCD, so we’re not using a gamma curve as such. Instead we’re using a transfer function called Perceptual Quantizer, otherwise known as ST 2084 (used with the Rec. 2020 colour space for HDR), which allows us to go from 0.0001 nits to 10,000 nits. Mapping one onto the other is a bit of an art at the moment rather than an exact science, hence why in my latest shader I provide a CRT gamma value to map back into linear space, an inverse tonemapper to map out to the full dynamic range of HDR, and then a (slightly) expanded gamut into Rec. 2020 space to make up for saturation issues when doing all this.
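For anyone curious what that transfer function looks like, here's a minimal sketch of the ST 2084 (PQ) encode using the published constants - an illustration only, not the shader's actual implementation:

```c
/* ST 2084 (PQ) inverse EOTF: encode a linear luminance value in nits
 * (0..10000) to a PQ signal value in 0..1. Constants are the published
 * SMPTE ST 2084 values. */
#include <math.h>
#include <stdio.h>

static double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;        /* 0.1593017578125 */
    const double m2 = 2523.0 / 4096.0 * 128.0; /* 78.84375        */
    const double c1 = 3424.0 / 4096.0;         /* 0.8359375       */
    const double c2 = 2413.0 / 4096.0 * 32.0;  /* 18.8515625      */
    const double c3 = 2392.0 / 4096.0 * 32.0;  /* 18.6875         */

    double y  = nits / 10000.0;                /* normalise to the 10,000 nit ceiling */
    double yp = pow(y, m1);
    return pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main(void)
{
    printf("100 nits  -> %.3f\n", pq_encode(100.0));  /* ~0.508 */
    printf("1000 nits -> %.3f\n", pq_encode(1000.0)); /* ~0.752 */
    return 0;
}
```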

With regards to colour gamuts in general, HDR uses Rec. 2020, not Rec. 709 or sRGB or Rec. 601 (which CRTs generally used), although your monitor will undoubtedly be using some other colour gamut than Rec. 2020, as no display can fully reproduce the Rec. 2020 colour gamut. Rec. 2020 is output by your graphics card and used by graphics APIs, and then it’s left up to your display to convert that colour gamut to the one it can actually output.
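As a concrete reference, mapping Rec.709 content into Rec.2020 space is just a 3x3 matrix in linear light. Below is a sketch using the standard BT.2087 coefficients; the shader's own transform may differ, particularly once 'expand gamut' gets involved:

```c
/* Linear-light Rec.709 -> Rec.2020 primary conversion (BT.2087 matrix).
 * Illustrative only; not copied from RetroArch or the shader. */
static void rec709_to_rec2020(const float in[3], float out[3])
{
    /* Rows give R2020, G2020, B2020 as weighted sums of R709, G709, B709. */
    static const float m[3][3] = {
        { 0.6274f, 0.3293f, 0.0433f },
        { 0.0691f, 0.9195f, 0.0114f },
        { 0.0164f, 0.0880f, 0.8956f },
    };
    for (int i = 0; i < 3; i++)
        out[i] = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
}
```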

6 Likes

Not the same phone (OnePlus 5T), but with the settings you requested:

HSM Glass w/HDR (720/200, 5 contrast)

crt-sony-pvm-2730-4k-hdr w/HDR

crt-sony-pvm-2730-4k-hdr w/HDR detail

HSM looks very bright; the HDR shader looks washed out and the whites look peach-colored in real life.

3 Likes

Thoughtful explanation, thanks!

From your explanation I understand that when HDR is turned on in the Windows desktop, Rec.2020 is set?

However, when I measure the primaries with my colorimeter on the Windows desktop with HDR=ON, DisplayCAL reports the primaries are almost identical to the sRGB primaries (like the values here https://en.wikipedia.org/wiki/Rec._709#Primary_chromaticities ).

So how does that work out with the Retroarch-HDR option “Expand Gamut” OFF and ON?

Is it like this: RetroArch HDR “Expand Gamut” = “OFF” -> Rec.709 primaries set in 2020 space -> gfx card outputs Rec.709 primaries set in 2020 space -> monitor maps Rec.709 primaries into the colour gamut that my display can actually output -> real output is close to Rec.709 primaries?

Please correct if needed.

Then what happens when I set RetroArch “Expand Gamut” = “ON”? What primaries are output by the display in that case? The native monitor primaries?

EDIT: clarification I’m talking only about the color primaries, so not the transfer function which you explained about.

It might have something to do with HDR PC Mode. If I tell the TV my PC is a console, I do get an expanded gamut. Otherwise it seems to use sRGB.

Wow, that looks absolutely terrible - no wonder you’ve been complaining! Hmm, what can we do? I think the first thing I can do is add in the OLED mask everybody has been talking about. That will move us away from accuracy, but it’s probably the best we can do given the physical layout of W-OLED.

Those pictures are fantastic though - amazing quality. I’ll have a further think about this and see if there is some other thing we can do to improve the situation.

2 Likes

Yes, Rec.2020 is used internally when HDR is switched on AND the app supports HDR. This will typically be used all the way out to the internals of your monitor, at which point your monitor will convert the colour gamut into the narrower range that it supports.

Presumably your monitor is outputting sRGB primaries that it has converted from Rec.2020, or it is not displaying HDR content. I’ve never used a colorimeter, so I’m not sure of the ins and outs of what it’s telling you.

You should probably ignore ‘expand gamut’; it’s for slightly modifying the Rec.2020 colour gamut. So off means transform from Rec.709 to standard Rec.2020, and on means transform from Rec.709 to a slightly expanded Rec.2020 to make up for the desaturation that occurs when coming from Rec.709 content as opposed to Rec.2020 content. Arguably I should just hide this option.

This all happens on the gfx card, and that always outputs the Rec.2020 standard when HDR is switched on in RA and your system fully supports HDR through the display pipeline (the expanded Rec.2020 is clipped down to standard Rec.2020).
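Purely to illustrate the idea (this is NOT RetroArch's actual code, and the expansion factor here is made up), an "expand then clip" step could look something like this, applied after the Rec.709 -> Rec.2020 matrix:

```c
/* Hypothetical sketch of "expand gamut": push colours slightly away from
 * their luma (more saturation), then clip back into the standard
 * Rec.2020 0..1 signal range. Illustration only. */
static void expand_gamut(float rgb2020[3], int expand_on)
{
    if (!expand_on)
        return;

    const float expansion = 1.10f; /* hypothetical strength, not RetroArch's value */
    const float luma = 0.2627f * rgb2020[0] + 0.6780f * rgb2020[1]
                     + 0.0593f * rgb2020[2]; /* Rec.2020 luma weights */

    for (int i = 0; i < 3; i++)
    {
        float v = luma + (rgb2020[i] - luma) * expansion; /* expand... */
        rgb2020[i] = v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v); /* ...then clip */
    }
}
```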

1 Like

HDR Vulkan still doesn’t work for me when in fullscreen (black screen, HDR detected by the TV). Sadness. GTX 1080 Ti, LG C1, Windows 11. Everything updated, tried with clean installs (driver and RetroArch).

I’m not sure, but I’m guessing your 1080 Ti might be the weak point in the chain here. I check the available formats in HDR, and maybe your gfx card isn’t reporting any supported HDR formats that RA can use (roughly the check sketched at the end of this post).

It works in windowed mode though?

You are using RA 1.10 as well?
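For reference, the kind of format check I mean looks roughly like this in Vulkan - a minimal sketch, not RetroArch's actual code, and it assumes the VK_EXT_swapchain_colorspace instance extension is enabled:

```c
/* Ask Vulkan which surface formats/colour spaces the GPU + driver expose
 * for this surface and look for an HDR10 (ST 2084) one. */
#include <stdlib.h>
#include <vulkan/vulkan.h>

static int surface_supports_hdr10(VkPhysicalDevice gpu, VkSurfaceKHR surface)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, NULL);

    VkSurfaceFormatKHR *formats = malloc(count * sizeof(*formats));
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, formats);

    int found = 0;
    for (uint32_t i = 0; i < count; i++)
    {
        /* HDR10 swapchains typically pair a 10-bit format with the
         * ST 2084 (PQ) colour space. */
        if (formats[i].colorSpace == VK_COLOR_SPACE_HDR10_ST2084_EXT &&
            (formats[i].format == VK_FORMAT_A2B10G10R10_UNORM_PACK32 ||
             formats[i].format == VK_FORMAT_A2R10G10B10_UNORM_PACK32))
        {
            found = 1;
            break;
        }
    }
    free(formats);
    return found;
}
```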

This is interesting to me because I am on a GTX 980 and Windows 11 (22000.466). I have HDR enabled in Windows and have the latest stable release of Retroarch (1.10). My TV is an LG C9. Everything is working correctly.

1 Like

That’s good to hear, so my guess was wrong! @JHorbach1, you are definitely using RA 1.10?

1 Like

I’m sorry if I’m giving you a hard time. I just want the shader to be better for the community; I know you know this. Let me know how I can help. I am astonished by how many people, like you, are willing to give me and others the time of day for our hobby. I haven’t said it before but I am so grateful for your work on HDR with Retroarch. It’s difficult to convey how much I appreciate your hard work. :+1: :hugs:

5 Likes

Yes, RetroArch is updated (nightly), it works in windowed mode, and DX12 and DX11 also work fine with HDR in fullscreen.

I use two monitors and I read that it could cause some problems, so I disabled the other monitor, but still no good.

3 Likes

Hmm, another attempt. I’m not sure whether we are better or worse - in person I think it looks better, but looking at the photos below I’m not convinced. Certainly if you now zoom in on both these images they are very close - in fact you can do this by just clicking on the image, it seems.

I added white correction, but sadly that didn’t really do what I wanted, which is to skew things in the blue direction without affecting the reds or greens. I added both tint and temperature to adjust both axes of the colour temperature spectrum, but I seemed to just get further away rather than nearer. I will continue fiddling and comparing with my PVM though.

Anyway, with that failed attempt, I decided to add the ability to change my curves on a per-channel basis. I then adjusted the blue channel scanlines, as can be seen below. This seems to be the fix and actually doesn’t impact the image that much, I guess because we’re not that sensitive to blues. However, maybe I need to go back to my colour correction to see if I can get them a little more matched up.
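For anyone wondering what per-channel curves mean in practice, here's an illustrative sketch - the beam profile and parameter names are my own guesses for illustration, not the actual Sony Megatron shader code:

```c
/* Each colour channel gets its own exponent applied to the vertical beam
 * profile, so e.g. the blue scanlines can be widened or narrowed
 * independently of red and green. */
#include <math.h>

/* dist: 0.0 at the scanline centre, 1.0 at its edge.
 * gamma: per-channel curve control, e.g. { 2.4f, 2.4f, 2.2f }. */
static float scanline_weight(float dist, float gamma)
{
    /* Raised-cosine beam profile shaped by the per-channel exponent. */
    float beam = 0.5f * (1.0f + cosf(3.14159265f * dist));
    return powf(beam, gamma);
}

static void apply_scanline(const float rgb_in[3], float dist,
                           const float gamma[3], float rgb_out[3])
{
    for (int i = 0; i < 3; i++)
        rgb_out[i] = rgb_in[i] * scanline_weight(dist, gamma[i]);
}
```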

I think the greens might be off a bit - you can see this in the center of his shield.

OnePlus 8 Pro camera: Pro Mode, ISO 100, WB 3510K, shutter speed 1/60, about 10cm from the screen, 48MP JPEG.

5 Likes

That looks amazing! Looks future proof to me

2 Likes

ah, nice. That was going to be my next suggestion, since the other phosphors looked to be the right size. :slight_smile:

It looks like the low end of the greens and reds (and maybe the very high end of the greens, too?) could be bumped up a bit.

In any event, it’s looking extremely close already, to the point that it’s now within the margin of individual devices/calibrations, I would say.

3 Likes

Yup, that’s exactly it, isn’t it: the low-end greens and reds. I just saw that myself in the forum pictures - it’s a great way to compare! I’ll adjust it tomorrow and see what it looks like. Thanks!

2 Likes