Sony Megatron Colour Video Monitor

Hi, so I’ve just created a pull request with the fix you proposed above - it rings a bell, as I’m starting to remember not deleting the straightforward gamma 2.2 power curve because I was seeing issues. I’d completely forgotten about it though, tbh. Thanks for pointing this out, investigating and proposing a solution - it’s greatly appreciated!

Also, the expanded transform isn’t from the mini engine - it’s from some Microsoft HDR library they put up for non-games developers, but I can’t remember off the top of my head what it was called. I’ll try and find it for you.

2 Likes

Great stuff, thanks - if possible, let me know whether 5.7 in the latest pull request fixes that for you!

1 Like

Ok, so it’s not the brightest HDR TV, but it should be good enough. It looks like you shouldn’t use it in HDR ‘game mode’, as that excessively dims small highlights (per RTings), and maybe small highlights are being treated as the phosphor triads we’re outputting here. Who knows, but it’s worth a try. You’ve definitely got HDR working in RA, as in, it works when shaders aren’t enabled?

2 Likes

Based on your pull request description/what you said about not deleting/commenting out the old code, should

float r601r709ToLinear_1(const float channel)
{
	//return (channel >= 0.081f) ? pow((channel + 0.099f) * (1.0f / 1.099f), (1.0f / 0.45f)) : channel * (1.0f / 4.5f);
	return (channel >= (HCRT_GAMMA_CUTOFF * (1.0f / 1000.0f))) ? pow((channel + 0.099f) * (1.0f / 1.099f), HCRT_GAMMA_IN) : channel * (1.0f / 4.5f);
}

also be reverted?

That part doesn’t seem to be causing bugginess or anything as far as I know; just checking, since it sounded like you might have been aiming to go back to the pre-4.2 transfer function in its entirety.

1 Like

Thank you for your observations and tips. I’m not sure I understand your question - do you want to know whether I can have RetroArch HDR working when I load a shader?

1 Like

RetroArch has its own, separate HDR settings under Settings > Video > HDR, and that offers a good way to check that HDR is working properly for RetroArch at all.

(Specifically, the menu text should get much brighter and glowier with HDR enabled.)

2 Likes

Yes, I have the RetroArch HDR menu active when running on the TV!

2 Likes

Possibly - this is currently a quick fix. I’d like to keep that function visible for the time being, until I’m fully convinced there’s no way it can work/it’s incorrect to do this. Maybe it is just plain wrong to use the 601/709 gamma curve with the output from these cores. I suppose they were designed to output to a modern desktop…

Does your text ‘glow’ white in the menu versus when HDR is off? Also, set your peak luminance and paper white luminance both to 560 or even 600 and see how you go. With video shaders off you need to do that in the RA Settings->Video->HDR menu; when using the Megatron, do it under the shader parameters.

2 Likes

I think that makes sense!

And tbh, I see no reason not to include that 601/709 function among the other options for the transfer to linear, even?

Any one of pure 2.2, pure 2.4, piecewise sRGB, or the 601/709 transfer function could be the right choice for certain content, especially for the ReShade version of the shader.

(Heck, some early-2000s PC games would probably look most correct with pure 1.9, even, given that early LCD gamma was so low.)
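For illustration, here’s a minimal sketch of what a selectable to-linear stage might look like, in the same scalar per-channel style as the shader’s existing helpers. The function names and the idea of an HCRT_TRANSFER_FUNCTION selector are hypothetical, not the shader’s actual API; the constants are just the published curves.

float PowerToLinear(const float channel, const float gamma) // pure power: 1.9, 2.2, 2.4, etc.
{
	return pow(channel, gamma);
}

float SRGBToLinear(const float channel) // piecewise sRGB decode
{
	return (channel > 0.04045f) ? pow((channel + 0.055f) * (1.0f / 1.055f), 2.4f) : channel * (1.0f / 12.92f);
}

float R601R709ToLinear(const float channel) // Rec. 601/709 OETF inverse, original 0.081/0.099 constants
{
	return (channel >= 0.081f) ? pow((channel + 0.099f) * (1.0f / 1.099f), 1.0f / 0.45f) : channel * (1.0f / 4.5f);
}

// A hypothetical HCRT_TRANSFER_FUNCTION parameter could then pick between them:
// 0 = pure power (using HCRT_GAMMA_IN), 1 = piecewise sRGB, 2 = Rec. 601/709.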

1 Like

Yeah, we could add it as an option, although at the moment I just think it’s causing bugs. Here’s a thought: maybe on old TVs with original hardware it did look like this? It’d be interesting to get to the bottom of this mystery either way: why do we have to use a bog-standard 2.2 power curve here and not the standards version?

I am far from an expert on the subject, but I think it’s that the 601/709 transfer function in question was an encoding standard rather than a decoding/display standard? And old CRTs were power gamma, so the decoding/display gamma was power by default?

And the “Why power 2.2?” part is, basically, because 2.2 was the default answer to “what is CRT gamma?” when sRGB was standardized, so it ended up being the target gamma as LCDs took over the market, and power 2.2 kind of became the standard in the PC space by default?

And for why other gamma transfer function options may be desirable:

Actual CRTs supposedly ranged from power 2.1-2.5 depending on brightness and such, with power 2.35-2.5 being the generally accepted average for “correct” in retrospect. In that range, 2.4 is the one modern displays generally include an option for, because…

From 2011 on, Rec. 709 has specified BT.1886 as the reference EOTF, which is equivalent to power 2.4 on OLEDs and other displays capable of displaying true black. Before that you would see… uh… “spirited” debate as to whether the correct decoding gamma for Rec. 709 was power 2.4 or power 2.2.

Some people (including whoever was in charge of implementing SDR tonemapping in HDR at Microsoft) have decided that the sRGB piecewise encoding gamma should also be the SDR decoding/display gamma on PC, so some games look most correct using that piecewise curve.

As mentioned, early LCDs had rather low gamma (as did some plasmas, I think?). Power 1.9 seems to be the shorthand/compatibility option for that nowadays, but it was all over the place, IIRC.

Mac OS X defaulted to power 1.8 until Snow Leopard in mid-2009, when they switched to power 2.2 by default.

iPhones 1-3 were also 1.8, then the iPhone 4 was 2.68, the iPhone 5 was 2.36, and from the iPhone 6 on they have all been 2.2(ish). Not necessarily the most relevant for a CRT shader, but I know some people like to use CRT shaders for pixel-art/low-res games even if they were designed for LCDs.

1 Like

Ah, yep, if you look at the full standard, that is indeed the encoding OETF:

Recommendation ITU-R BT.601-7 (03/2011)

(p6) “In typical production practice the encoding function of image sources is adjusted so that the final picture has the desired look, as viewed on a reference monitor having the reference decoding function of Rec. ITU-R BT.1886, in the reference viewing environment defined in Rec. ITU-R BT.2035.”

So power 2.4 would (effectively) be the (modern) standard for decoding/displaying Rec. 601, without getting into the weeds of implementing BT.1886’s particulars (which may not be worth bothering with, given that BT.1886 wasn’t standardized until 2011, well after the age of the CRT ended).
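For reference, a sketch of the BT.1886 EOTF itself (standard formulation, variable names my own), which also shows why it collapses to a plain 2.4 power curve on a display with true black:

// BT.1886 EOTF: screen luminance in cd/m^2 from a normalised [0,1] signal V,
// given the display's white luminance Lw and black luminance Lb.
float BT1886ToLuminance(const float V, const float Lw, const float Lb)
{
	const float gamma = 2.4f;
	const float root  = 1.0f / gamma;
	const float a     = pow(pow(Lw, root) - pow(Lb, root), gamma);
	const float b     = pow(Lb, root) / (pow(Lw, root) - pow(Lb, root));
	return a * pow(max(V + b, 0.0f), gamma);
}
// With Lb = 0 (e.g. an OLED): a = Lw and b = 0, so this is just Lw * pow(V, 2.4f).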

1 Like

Having played around with 5.7 some more, I definitely think leaving

float r601r709ToLinear_1(const float channel)
{
	//return (channel >= 0.081f) ? pow((channel + 0.099f) * (1.0f / 1.099f), (1.0f / 0.45f)) : channel * (1.0f / 4.5f);
	return (channel >= (HCRT_GAMMA_CUTOFF * (1.0f / 1000.0f))) ? pow((channel + 0.099f) * (1.0f / 1.099f), HCRT_GAMMA_IN) : channel * (1.0f / 4.5f);
}

as is would be the best course, at least for the time being.

Maybe the default setting for Gamma Cutoff should be changed to 81? (The parameter is divided by 1000 in the shader, so 81 gives the 0.081 breakpoint from the original 601/709 curve.)

I’m still not sure whether defaulting Gamma to 2.2 or 2.4 would be best. I’ve been favoring 2.4, but I used an FW900 (2.35-ish gamma) for something like 15 years before I got the C1, so 2.4 gamma is more or less what I’m used to for emulation, whether or not that produces the most accurate results.

Side-by-side comparisons with actual systems on actual CRTs to see which comes closer seem like the best course here, and that answer could of course vary from core to core.

2 Likes

Yes, so I was under the impression that what differentiated sRGB from Rec. 601/709 was that sRGB had an EOTF gamma of 2.4 and Rec. 601/709 (ITU) had an EOTF gamma of 2.2. You can see this in the ‘Transfer Function’ section of this document from the Khronos Group: https://registry.khronos.org/DataFormat/specs/1.2/dataformat.1.2.pdf, where 1/0.45 is a more accurate way of saying 2.2 recurring.

I thought all the EOTF and OETF functions really amounted to was how to get from linear to non-linear space and back again, and that each device would do this - so the signal output unit on the back of a console would apply a standards-defined OETF, and the signal sent down the cable to the TV would be in non-linear/gamma-corrected space. Then the TV, in the simplest case, would directly output that signal, but could convert back and forth using the standards-defined OETFs and EOTFs.

Are we saying that’s not the case, or are we saying that’s not the case here because we’re dealing with emulator cores and not real hardware? Or that the original hardware sent a straight power-gamma signal down the cable and all the standards happened in the TV - but then how would the TV know a straight power-gamma signal was being sent?
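For what it’s worth, here is a sketch of the encode half of that round trip (the Rec. 601/709 OETF proper), in the same scalar style as the shader’s helpers; the r601r709ToLinear_1 function being discussed is its inverse, with 0.081 = 4.5 × 0.018 as the matching breakpoint.

// Encode direction: linear scene light in [0,1] -> non-linear signal for the cable.
float LinearToR601R709(const float channel)
{
	return (channel >= 0.018f) ? 1.099f * pow(channel, 0.45f) - 0.099f : channel * 4.5f;
}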

1 Like

It’s fixed! Looks much better in the affected games

4 Likes

It is my understanding that standard CRT TVs were, inherently, power gamma. That power gamma is essentially a mathematical representation of what the CRT TV would do to a signal.

So it’s not so much that the TV knew that the signal needed to be decoded using power gamma, but that signals were encoded such that, when they were invariably decoded using power gamma, the result would be correct.

Since modern displays don’t work that way, they are calibrated/tuned to match instead. So, in SDR, Windows encodes the signal using sRGB encoding gamma, and my C1 decodes that signal using my choice of power 1.9, power 2.2, power 2.4, or BT.1886, depending on the menu options I have selected in that picture mode. My old FW900 would have inherently decoded that same signal using power 2.35(ish). Some people have calibrated their monitor to use the sRGB piecewise curve instead of power (for better or worse, this is how unmodified Win10/11 transfers SDR content into HDR). And so on, depending on the display.

Now, at least so far as I understand how Megatron works,

float r601r709ToLinear_1(const float channel)
{
	//return (channel >= 0.081f) ? pow((channel + 0.099f) * (1.0f / 1.099f), (1.0f / 0.45f)) : channel * (1.0f / 4.5f);
	return (channel >= (HCRT_GAMMA_CUTOFF * (1.0f / 1000.0f))) ? pow((channel + 0.099f) * (1.0f / 1.099f), HCRT_GAMMA_IN) : channel * (1.0f / 4.5f);
}

is our decoding/display gamma, effectively taking the SDR display’s place in the chain, with the shader’s end result ultimately being re-encoded for decoding by the HDR perceptual quantizer (PQ) as defined by ST 2084.

Trick is, different emulators/cores/games/applications could have been made on displays using who knows what decoding gamma, ranging from approximately power 1.8-2.5, or sRGB piecewise, or even some uncalibrated mess that it would be completely futile to try and match. And, in the case of emulators/cores specifically, what the developer was seeing could have little to do with what the games actually looked like on period CRT displays.
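For anyone following along, a sketch of that final re-encode step (the ST 2084 inverse EOTF), assuming linear luminance normalised to the 10,000 cd/m² PQ reference; this is the standard formulation rather than a quote from the Megatron source:

// ST 2084 (PQ) inverse EOTF: normalised linear luminance Y (display cd/m^2 divided by 10000)
// -> the non-linear signal sent to the HDR display.
float LinearToPQ(const float Y)
{
	const float m1  = 0.1593017578125f; // 2610 / 16384
	const float m2  = 78.84375f;        // 2523 / 4096 * 128
	const float c1  = 0.8359375f;       // 3424 / 4096
	const float c2  = 18.8515625f;      // 2413 / 4096 * 32
	const float c3  = 18.6875f;         // 2392 / 4096 * 32
	const float Ym1 = pow(Y, m1);
	return pow((c1 + c2 * Ym1) / (1.0f + c3 * Ym1), m2);
}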

2 Likes

Are you saying that the standards’ EOTFs (and OETFs) - 601/709, or even later sRGB - were basically ignored back then? There must have been circuitry devoted to the conversion, and those engineers must have followed the TV standards - I can’t believe they all just went with a power gamma and ignored industry standards, at least not for professional displays. Do we know whether the signal output by the console is in linear or non-linear space? If it’s non-linear, then again there must be some circuitry doing that conversion.

My understanding is that they followed industry standards for encoding, safe in the knowledge that functionally all extant displays at that time would decode/display that signal in the same general way as the CRT display they were using, because all CRTs would “decode” using a power gamma curve by definition. Power gamma being the decoding/display standard was implicit.

But then things changed in the 2000s, everything got weird for a few years, and actual explicit standards for decoding gamma on modern displays were put into place in response - thus BT.1886. (Which is the same as power 2.4 on an OLED, but slightly different on an LCD to compensate for an LCD’s inability to display true black.)

1 Like

While this is all well and good if it’s the path to the correct implementation, I’m not someone who pays much attention to the numbers so much as to how things look. So if I find things too bright, I darken them.

So basically, whatever “problems” existed with respect to gamma, I would have already worked around them in my tweaking and development of my presets.

So while for some this new update might look better, for me it just messed up the Gamma on most if not all of my presets.

I could go back and re-tweak them manually, but before I go down that road, is there a magic number I can lower or multiply my Gamma settings by that will get me back to how things looked before the update?

On another note, I’m noticing something strange when Slot Mask is selected on my screen. The layout chosen is Mask Layout 1 (WOLED).

Deconvergence Off

Deconvergence On

The scanline gaps seem to be making a square/pulse-wave-like up-and-down pattern instead of being in a straight line, and the deconvergence settings don’t seem to be having much of an effect on the individual phosphors.

This creates some very strange looking artifacts on screen when viewed at a normal viewing distance.

I believe the Shadow Mask might also have some issues, because it also looks strange at 300 TVL using Mask Layout 1 (WOLED) from a normal viewing distance.

I didn’t notice this when the Aperture Grille setting was used.

Lastly, can we have an additional Slot Mask pattern that looks like the examples below, with the vertical black space at the sides of the triads (horizontal gaps) and maybe also the vertical gaps being a bit more pronounced?

1 Like