Sony Megatron Colour Video Monitor

Yes, so I was under the impression that what differentiated sRGB from Rec. 601/709 was that sRGB had an EOTF gamma of 2.4 and Rec. 601/709 (ITU) had an EOTF gamma of 2.2. You can see this in the ‘Transfer Function’ section of this document from the Khronos Group: https://registry.khronos.org/DataFormat/specs/1.2/dataformat.1.2.pdf where 1/0.45 is a more accurate way of saying 2.2 recurring.

I thought all the EOTF and OETF functions really were was a way to get from linear to non-linear space and back again, and that each device would do this - so the signal output unit on the back of a console would apply a standards-defined OETF, and the signal sent down the cable to the TV would be in non-linear/gamma-corrected space. Then the TV in the simplest case would directly output that signal, but could convert back and forth using the standards-defined OETFs and EOTFs.

Are we saying that’s not the case, or are we saying that’s not the case here because we’re dealing with emulator cores and not real hardware? Or that original hardware sent a straight power-gamma signal down the cable and all the standards happened in the TV - but then how would the TV know a straight power-gamma signal was being sent?

It’s fixed! Looks much better in the affected games

It is my understanding that standard CRT TVs were, inherently, power gamma. That power gamma is essentially a mathematical representation of what the CRT TV would do to a signal.

So it’s not so much that the TV knew that the signal needed to be decoded using power gamma, but that signals were encoded such that, when they were invariably decoded using power gamma, the result would be correct.

Since modern displays don’t work that way, they are calibrated/tuned to match instead. So, in SDR, Windows encodes the signal using sRGB encoding gamma and my C1 decodes that signal using my choice of power 1.9, power 2.2, power 2.4, or BT1886, depending on the menu options I have selected in that picture mode. My old FW900 would have inherently decoded that same signal using power 2.35(ish). Some people have calibrated their monitor to use sRGB piecewise instead of power (for better or worse, this is how unmodified Win10/11 transfers SDR content into HDR). And so on, depending on the display.
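
To make that concrete, here’s a minimal sketch (the function names are mine, not from any shader): content is encoded once with the sRGB piecewise OETF, and whatever curve the display then decodes with determines the final image. The two curves are close but not exact inverses of each other, which is why the choice on each end visibly matters.

float srgbEncode(const float linear)
{
	// standard sRGB piecewise OETF: linear segment below 0.0031308, power segment above
	return (linear <= 0.0031308f) ? linear * 12.92f : 1.055f * pow(linear, 1.0f / 2.4f) - 0.055f;
}

float displayDecode(const float signal)
{
	return pow(signal, 2.2f);	// or 1.9/2.4/BT1886 etc., per the display's calibration
}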

Now, at least so far as I understand how Megatron works,

// Inverse of the BT.601/BT.709 OETF, i.e. the EOTF that brings the encoded
// signal back to linear light. The standard constants (cutoff 0.081, exponent
// 1/0.45) are preserved in the commented line; the live version exposes the
// cutoff (scaled by 1/1000) and the exponent as user parameters.
float r601r709ToLinear_1(const float channel)
{
	//return (channel >= 0.081f) ? pow((channel + 0.099f) * (1.0f / 1.099f), (1.0f / 0.45f)) : channel * (1.0f / 4.5f);
	return (channel >= (HCRT_GAMMA_CUTOFF * (1.0f / 1000.0f))) ? pow((channel + 0.099f) * (1.0f / 1.099f), HCRT_GAMMA_IN) : channel * (1.0f / 4.5f);
}

is our decoding/display gamma, effectively taking the SDR display’s place in the chain, with the shader’s end result ultimately being re-encoded for decoding by the HDR perceptual quantizer as defined by ST2084.
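
For reference, the ST2084 inverse EOTF (the PQ encode step) is standardized, so that final re-encode should look something like this textbook formulation (the function name is mine, not necessarily Megatron’s actual code):

float linearToPQ(const float nits)	// nits: absolute luminance in cd/m^2
{
	const float m1 = 0.1593017578125f;	// 2610/16384
	const float m2 = 78.84375f;		// 2523/4096 * 128
	const float c1 = 0.8359375f;		// 3424/4096
	const float c2 = 18.8515625f;		// 2413/4096 * 32
	const float c3 = 18.6875f;		// 2392/4096 * 32
	const float y  = clamp(nits * (1.0f / 10000.0f), 0.0f, 1.0f);	// PQ is referenced to 10000 nits
	const float ym = pow(y, m1);
	return pow((c1 + c2 * ym) / (1.0f + c3 * ym), m2);
}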

Trick is, different emulators/cores/games/applications could have been made on displays using who knows what decoding gamma, ranging from approximately power 1.8-2.5, or sRGB piecewise, or even some uncalibrated mess that it would be completely futile to try to match. And, in the case of emulators/cores specifically, what the developer was seeing could have little to do with what the games actually looked like on period CRT displays.

Are you saying that the standards’ (601/709, or even later sRGB) EOTFs (and OETFs) were basically ignored back then? There must have been circuitry devoted to the conversion, and those engineers must have followed the TV standards - I can’t believe they all just went with a power gamma and ignored industry standards, at least not for professional displays. Do we know whether the signal output by the console is in linear or non-linear space? If it’s non-linear, then again there must be some circuitry doing that conversion.

My understanding is that they followed industry standards for encoding, safe in the knowledge that functionally all extant displays at that time would then decode/display that signal in the same general way as the CRT display they were using, because all CRTs would “decode” using a power gamma curve by definition. Power gamma being the decoding/display standard was implicit.

But then things changed in the 2000s, everything got weird for a few years, and actual explicit standards for decoding gamma with modern displays were put into place in response, thus BT1886. (Which is the same as power 2.4 on an OLED, but slightly different on an LCD to compensate for an LCD’s inability to display true black.)

While this is all well and good if this is the path to the correct implementation, I’m not one who pays as much attention to the numbers as to how things look. So if I find things too bright, I darken.

So basically, whatever “problems” existed with respect to Gamma, I would have already worked around them in my tweaking and development of my presets.

So while for some this new update might look better, for me it just messed up the Gamma on most if not all of my presets.

I could go back and retweak them manually, but before I go down that road: is there a magic number I can lower or multiply my Gamma settings by that will get me back to how things looked before the update?

On another note, I’m noticing something strange when Slot Mask is selected on my screen. The layout chosen is Mask Layout 1 (WOLED).

Deconvergence Off

Deconvergence On

The scanline gaps seem to be making a square/pulse wave-like up and down pattern instead of being in a straight line and the deconvergence settings don’t seem to be having that much of an effect on the individual phosphors.

This creates some very strange looking artifacts on screen when viewed at a normal viewing distance.

I believe the Shadow Mask might also have some issues because it also looks strange at 300TVL using Mask Layout 1 (WOLED) from a normal viewing distance.

I didn’t notice this when the Aperture Grille setting was used.

Lastly, can we have an additional Slot Mask pattern that looks like the examples below, with the vertical black space at the sides of the triads (horizontal gaps) and maybe also the vertical gaps being a bit more pronounced?

I played around with the updated shaders a lot this weekend and agree, something appears to be off with the Shadow Mask one on my LG OLED as well.

I wonder if the mask looking “off” is related to what Jamirus and I mentioned above. I described it as looking misaligned, but fixed it by enabling Integer Scale Overscale (not an ideal fix, I know). Here’s an example of slot mask in a Sierra AGI game in ScummVM with aspect correction enabled (4:3):

This does not look correct at all…and it shouldn’t because you’re not using the proper mask layout for your LG OLED TV.

I think I use Integer Scale and Integer Scale Overscale quite a bit, and I don’t think that has any bearing on the issue I experienced. I can recheck it though.

I already showed that it’s something that’s happening when I employ Deconvergence.

So. I have good news, bad news, and more good news:

First bit of good news: I think I’ve sourced a more objective and satisfying solution to the gamma problem. Page 3 of Recommendation ITU-R BT.1886 (03/2011) includes what is referred to as a “CRT matching” EOTF. Unfortunately, I don’t have a solid grasp on how to implement this in slang, so I can’t really test it on my own before suggesting it. Thoughts and/or assistance, anyone?
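
In case it helps get testing started, here’s a rough slang-style sketch of the Recommendation’s Annex 1 reference EOTF: the gamma 2.4 curve with gain and black lift derived from the screen’s white and black luminance. I can’t confirm this is the exact “CRT matching” variant being referred to, and the function/parameter names (bt1886ToLinear, Lw, Lb) are mine:

float bt1886ToLinear(const float V, const float Lw, const float Lb)
{
	// Lw/Lb: screen white/black luminance in cd/m^2 (e.g. 100.0f and 0.1f)
	const float gamma = 2.4f;
	const float powLw = pow(Lw, 1.0f / gamma);
	const float powLb = pow(Lb, 1.0f / gamma);
	const float a     = pow(powLw - powLb, gamma);	// "user gain" (legacy contrast control)
	const float b     = powLb / (powLw - powLb);	// "black lift" (legacy brightness control)
	return a * pow(max(V + b, 0.0f), gamma);	// absolute luminance out
}

Dividing the result by Lw would then give the normalised 0-1 value the rest of the shader chain presumably expects.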

The bad news: I got lilium’s HDR analysis ReShade tools running with RetroArch, and I’m reasonably certain that the color system/phosphor conversions are currently happening entirely in Rec709/sRGB space, and thus are effectively being clipped to Rec709, at least if “HDR: Original/Vivid” is set to Original. (As you thought, Vivid instead locks/pushes everything to a non-standard “Expanded Rec709” gamut suggested by Microsoft.)

Second bit of (mostly) good news: I’ve already prototyped what seems to be a working solution, but it will require a fair bit of overhauling.

Basically, the function that “HDR: Original/Vivid” is tied to in hdr10.h is already set up to be turned into a Color System replacement. We just need to add more transforms to Rec2020 in addition to k709_to_2020 and kExpanded709_to_2020, which can be calculated/generated using https://www.colour-science.org:8010/apps/rgb_colourspace_transformation_matrix or http://color.support/colorspacecalculator.html

For example, kNTSCJP22_to_2020 would be:

const mat3 kNTSCJP22_to_2020 = mat3(
   0.691014f, 0.261287f, 0.047700f,
   0.109646f, 0.876813f, 0.013541f,
   0.014382f, 0.096629f, 0.888989f);

Or, for an option that pushes well outside Rec709, kNTSC1953_to_2020 would be:

const mat3 kNTSC1953_to_2020 = mat3(
   0.924130283352921f, 0.058040878058827f, 0.017828838588251f,
   0.081832452209371f, 0.839996416103710f, 0.078171131686919f,
  -0.003138728266940f, 0.033216715568834f, 0.969922012698106f);
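
To illustrate how those could slot in alongside the existing two, the selection might end up looking something like this (purely a hypothetical sketch - the actual plumbing in hdr10.h will differ, and contentGamutTo2020 is a made-up name):

mat3 contentGamutTo2020(const uint gamut)
{
	switch (gamut)
	{
		case 0u: return k709_to_2020;		// previously "Original"
		case 1u: return kExpanded709_to_2020;	// previously "Vivid"
		case 2u: return kNTSC1953_to_2020;
		default: return kNTSCJP22_to_2020;
	}
}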

This method looks much better by eye and in the analysis tools, but I’m not 100% sure it is currently doing everything we might want in conjunction with the “Colour Accurate” option. (I think we also want to lock the current color system to 709? But I’m still testing.)

I’ll continue testing and converting the remaining gamuts for now.

I had also already been working on assembling some additional phosphor options for 60s and 70s TVs, so this will be a good time to add those assuming the UI has enough space.

Oh, and Special K can also take .jxr HDR screenshots of RetroArch with HDR shaders included. No idea where to host those off the top of my head, though.

So what was that fix in the last PR that made all of my presets very dark all about then?

How would you rate that fix?

IMHO, when working on something so critical to a vast multitude of users, perhaps there should be some beta/alpha hosting somewhere and a little more thorough testing and consensus before a PR is made to the main branch of RetroArch.

To me these aren’t things that should be interfered with willy-nilly unless and until a better solution is confirmed to exist.

Also, it would be nice if there were some way or formula to “get back”, so those who were satisfied with their settings don’t have to go about manually tweaking again.

That was a fix for a long-standing bug that was apparently causing tonemapping to be applied twice, resulting in severe artifacting/boosted brightness in darker/near-black colors. It just may not have been happening on your system for some reason we never nailed down.

The part of the shader code I am suggesting a potential alternative option for wasn’t changed in 5.7, and I agree that the current code should probably remain an option unless an alternative ends up being universally preferred, since it is clearly working well as is.

5.7 shouldn’t have made things darker in general, though… can you try reverting just colour_grade.h to the version from 5.6 to verify there isn’t anything else going on?

Also, were your settings used for those sgng pictures somehow darkened compared to the defaults in your preset pack? Because the overall brightness of those presets looks identical to 5.6 on my system, just without the weird artifacting/boosted brightness in dark colors.

If it was happening, I might have just increased my Gamma settings until it looked right to me - which is what I actually did.

So after this latest update, my presets, on which I had increased the Gamma and which I had tested so far, all seemed much darker.

I can check it, but if I was testing a dark game then it might have appeared darker in general; I wasn’t really differentiating between general darkness and darkness in the near blacks in my statement on the matter.

I can’t say for sure as I’ve already moved on from the settings in the first release of my preset pack.

As a matter of fact, I’ve already started adjusting my presets to take the new changes into consideration. I have no choice, actually, because what was once satisfactory in the games I had tested so far is now unsatisfactory. If there was a bug that I had already tried to compensate for by turning some knobs, I’ll just have to compensate for the new behavior instead.

It’s cool. I’ve been there before.

I still find near blacks pretty dark, and now have the Gamma lowered to a bit below 2.2.

Previously I had raised the Gamma to 2.45+.

I’m AFK now so that’s the most accurate account I can give you currently.

My issue is the alignment of the scanlines. The mask makes no difference. From a distance it looks like there’s a squiggly line every 2 or 3 scanlines. Perhaps my photo doesn’t convey it very well, but I thought it might be similar to what you described as “scanline gaps seem to be making a square/pulse wave-like up and down pattern instead of being in a straight line”.

Not sure if this is the same issue rancid is having when using slot mask, but I think it’s the same as Jamirus’ issue discussed further up.

Here’s a photo with and without integer scale overscale enabled (using the correct mask for OLED):

I’ve polished up the prototype gamut fix/improvement a bit and committed it to a fork for testing:

https://github.com/Azurfel/slang-shaders/tree/master/hdr/shaders/include

Only hdr10.h and parameters.h have been changed thus far.

The setting is currently called HDR: Content Color Gamut, replacing HDR: Original/Vivid right under peak luminance and paper white.

Current options are: 0:Rec 709 (previously Original)|1:Expanded Rec 709 (previously Vivid)|2:NTSC 1953|3:Rec 601|4:PAL|5:P22-J|6:P22-80|7:P22-90|8:RPTV|9:Display P3|10:Rec 2020
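
For anyone poking at the fork, a setting like that presumably boils down to a single slang parameter line in parameters.h along these lines (the identifier here is my guess, not necessarily what the fork actually uses):

#pragma parameter HCRT_HDR_CONTENT_GAMUT "HDR: Content Color Gamut" 0.0 0.0 10.0 1.0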

That ordering and the exact composition are very much preliminary, and the menu option in general is a rather messy WIP. I very much wanted to ensure that the following info was front and center during testing, even though all that extra text makes a bit of a mess of things:

I would strongly recommend setting “Mask Accurate/Colour Accurate” to Colour Accurate, as the fully expanded gamuts further exacerbate Mask Accurate’s issues.

Based on comparisons using a dummy color gamut that fits entirely within Rec 709, Colour System should be set to r709, and Phosphors to NONE with these settings.

I just want to say that I am glad this shader is getting better and better thanks to you guys, especially with the correct color conversion and gamma mapping you are working on at the moment. Thank you so much.

In my opinion this is the most authentic shader available today, and it’s very well worth every improvement. It’s just sad that the ReShade port doesn’t seem to work with DirectX 9, OpenGL and Vulkan, which many standalone emulators use.

For example, the excellent Redream Dreamcast emulator is based on OpenGL, and therefore I can’t run it with the shader in HDR. Also, the standalone PCSX2 emulator crashes with AutoHDR, despite being configured with DirectX 11/12.

I hope there is a possibility that the Megatron shader with AutoHDR will support all graphics APIs in the future. Otherwise my only choice might be to get the new RetroTink 4K scaler with its built-in 4K CRT HDR masks, which is independent of software compatibility issues and can just be hooked up between my OLED and PC. The only problem is that this will be expensive…

Have you tried Flycast? I know that has a DX11 option (or at least it did the last time I used it).

Yes, I tried the standalone Flycast with ReShade, but it also crashes with the AutoHDR files and always resets to OpenGL in the menu. Only if I delete the AutoHDR files in the Flycast main folder does it work with the shader in SDR - with low brightness.

On the other hand, the Flycast core in RetroArch has graphical glitches, and the 4K CRT mask is not accurate either, because the internal resolution is somehow used differently in RetroArch compared with the standalone emulator: RetroArch seems to also scale the resolution externally, which the Megatron shader doesn’t like, so it doesn’t apply the mask correctly.

The only standalone emulators I could get to run perfectly with ReShade and HDR are the PS1 emulator DuckStation and MAME. Standalone MAME works correctly in HDR if I choose the “bgfx” video option in the MAME settings, which should be based on the DX11 or DX12 API.

This is how the mask looks with Flycast within RetroArch (please watch in fullscreen):

The internal resolution is native @ 640x480, and as you can see the CRT mask just looks wrong.

And this is how the mask correctly looks within DuckStation, even with a very high internal resolution above 4K:

Sorry to post these pictures with the mask set in SDR; I have not yet found a way to save screenshots correctly in HDR and view them properly. If someone has a suggestion, that would be very nice, as it would make it a lot easier to post captured screenshots here that look as they would with the game running in HDR (higher brightness etc.) without using a smartphone.

This is how the game looks in the standalone Flycast emulator, with the correct mask in comparison to RetroArch, but unfortunately without HDR, as it crashes and resets to OpenGL.

And this is why the shader is looking so good:

From a close-up you can see that it faithfully resembles an aperture grille mask: just a pure phosphor mask without any additional bulk processing, which for me personally is not needed, and which also keeps the shader lightweight and undemanding on hardware.

And HDR finally makes it possible to have sufficient brightness too.

It’s really just compatibility that is lacking at the moment, with the missing APIs and the crashes I experienced. Besides this, and maybe some fine-tuning of the color and gamma mapping mentioned above, the shader is probably as good as it will ever get in regard to mimicking a CRT.

I think it can even surpass CRTs in some aspects, as OLEDs have no convergence issues and better black levels, just to mention a few things.
