Sony Megatron Colour Video Monitor

The version already included in pre-shaders-afterglow is mathematically identical to the version in Grade as far as I can see.

Pattern recognition and a rudimentary understanding of code/slang.

Enough to grok that:

src = SMS_bl > 0.0 ? pow(src, vec3(1.0,1.0,1.0/1.16)) : src;

from Grade can be translated to/expressed as:

if (params.sms_blue > 0.5) imgColor.r = imgColor.r * (1.00), imgColor.g = imgColor.g * (1.00), imgColor.b = imgColor.b * (1.16);

in the context of crt-guest-advanced-ntsc/pre-shaders-afterglow.

1 Like

Excellent! Thanks!

Seeing that we’re in a shader quality-of-life/feature-improvement mood, how hard would it be to implement a cropping feature similar to the one in the Mega Bezel Reflection Shader or (although I’m not as familiar with it) the one included in CRT-Guest-Advanced?

1 Like

I’m not sure exactly how it compares to Mega Bezel or CRT-Guest-Advanced’s cropping functions feature-wise, but personally, I prepend image-adjustment to the beginning of my presets for my cropping needs, and it works exactly as I want it to. It masks to true black without any resizing or rescaling whatsoever.
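For anyone wondering what that masking approach amounts to under the hood, here is a rough sketch of the idea in slang/GLSL terms (illustrative only; the function and parameter names are made up and this is not the actual image-adjustment code): pixels outside the chosen borders are forced to true black, and everything else passes through untouched, so nothing is resized or rescaled.

// Hypothetical crop-to-black helper; crop_* are fractions of the frame (0.0-0.5),
// not the real image-adjustment parameter names.
vec3 crop_to_black(vec3 src, vec2 uv, float crop_left, float crop_right, float crop_top, float crop_bottom)
{
   bool inside = uv.x >= crop_left && uv.x <= (1.0 - crop_right) &&
                 uv.y >= crop_top  && uv.y <= (1.0 - crop_bottom);
   // Outside the window: true black. Inside: pass the source colour through unchanged.
   return inside ? src : vec3(0.0);
}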

1 Like

I haven’t followed the conversation fully, but the following settings (those that deviate from default) can render the image mostly a no-op.

g_signal_type = "0.000000"
g_crtgamut = "0.000000"
g_Dark_to_Dim = "0.000000"
g_GCompress = "0.000000"
wp_temperature = "6504.000000"
g_U_MUL = "1.000000"
g_V_MUL = "1.000000"
g_CRT_br = "1.000000"
g_CRT_sl = "0.000000"
g_vignette = "0.000000"

I say mostly because while the colors are intact, the transfer function is not, so the image will turn a little darker (not noticeable except in an A/B comparison). This is a side effect of the EOTF_1886a() function, which is a simulation of a CRT transfer function with close-to-real CRT knob controls.
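For context, the generic BT.1886-style EOTF that this kind of function is built around looks roughly like the sketch below: the standard ITU-R BT.1886 form, with the white and black luminance acting as the “knobs”. Grade’s EOTF_1886a() adds further controls on top of this, so treat it only as an approximation of what is going on, not the actual implementation.

// Generic BT.1886 EOTF sketch (not Grade's EOTF_1886a() itself):
// maps the non-linear signal V (0..1) to light output given the white
// luminance Lw and black luminance Lb, the "CRT knob"-style controls.
vec3 eotf_1886(vec3 V, float Lw, float Lb)
{
   const float g = 2.4;                                                 // nominal CRT exponent
   float a = pow(pow(Lw, 1.0 / g) - pow(Lb, 1.0 / g), g);               // overall gain
   float b = pow(Lb, 1.0 / g) / (pow(Lw, 1.0 / g) - pow(Lb, 1.0 / g));  // black lift
   return a * pow(max(V + vec3(b), vec3(0.0)), vec3(g));
}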

If you don’t want that either, I mean, what’s the point of loading Grade at all… I can nullify the effect, but it’s a bit of a pain for a corner-case scenario.

Here’s a comparison image.

3 Likes

Yeah, for our use case (that is, for maximally clean compatibility with Megatron) it ultimately just made way more sense to port the remaining two Sega functions over to crt-guest-advanced-ntsc, rather than trying to force Grade to run just the Sega functions and absolutely nothing else.

1 Like

@Cyber

In case you are already using my recent Sega additions to crt-guest-advanced-ntsc: I implemented the full Sega MS Nonlinear Blue behavior as described in the Notes & Measures: Nonlinear Blue on Sega Master System 1 & Other Findings document.

1 Like

Thanks @Azurfel. I appreciate what you’re doing and your contributions to the community, especially towards the improvement of the Sony Megatron Color Video Monitor Shader; however, I haven’t switched away from using the full Grade implementation, or even from the shader combo I posted sans the Afterglow passes.

I just felt like I might be playing around with some of what they have to offer at some point. For example, the Vignette feature in Grade.

It’s still good that you’re implementing these changes, fixes and additions, though. Hopefully we’ll see a more permanent implementation at some point that’s available to all Sony Megatron Color Video Monitor users; that said, I’m aware of the strict performance targets and constraints of this particular shader.

So I’m glad it’s there; at some point I might get around to giving things a whirl. I’m having a lot of fun tweaking the blending offered by Guest-NTSC together with the additional sharpness (well, actually blur mitigation) provided by the horizontal filtering section.

Look at how long it took me to get around to playing with that in Sony Megatron, just to give you an example of how slowly I move on certain things sometimes. That doesn’t mean I don’t appreciate them, though, as that is a feature I’ve used extensively in my much older CyberLab Neo-GX shader presets.

Looking forward to the next meaningful consolidation of the main Sony Megatron Color Video Monitor shader itself.

Can you tell me if you’ve achieved all that you set out to with your work on the colour accuracy and tonemapping?

The “HDR: Content Color Gamut” system as it is on my testing fork is functionally feature complete, and has been for about four months. I may add another gamut or two at some point, mainly ColorMatch RGB and Adobe RGB, though I’m not aware of any games that actually use either.

The associated shader parameter does admittedly feel kind of messy, though. I would strongly prefer named parameter options (rather than numbers with a key) given the sheer number of gamuts covered, but I’m under the impression that isn’t possible within the framework. (If you know of any slang shaders that use named parameters rather than numbers, please do point me at them.)
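For reference, the usual slang workaround is exactly the numbers-with-a-key approach: a single numeric #pragma parameter whose label spells out what each value means. The parameter name and gamut list below are purely illustrative, not the actual ones from the fork.

// Hypothetical example, not the actual parameter from the testing fork:
// a numeric option where the label doubles as the key for each value.
#pragma parameter hcrt_example_gamut "Content Colour Gamut [0=sRGB, 1=NTSC-J, 2=PAL, 3=SMPTE-C]" 0.0 0.0 3.0 1.0
// ...and in the shader body, the value is compared against those numbers:
// if (HCRT_EXAMPLE_GAMUT == 1.0) { /* select NTSC-J primaries */ }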

@MajorPainTheCactus determined that the color errors/gamut overshoots were the result of very dark/near-black colors being “warped” out of the gamut, presumably due to floating-point inaccuracies. I still need to test the clamp modification that was mentioned for myself. I was quite busy over the last few months, and it totally slipped my mind.

2 Likes

Just finally getting around to testing this now. As mentioned, it totally got lost in the shuffle x.x

I added a menu option to my personal testing build to toggle the clamp on and off, and a difference in color rendition is visible to both my eyes and a friend’s. We both independently came to the conclusion that having the clamp enabled results in colors closer to the colors seen with shaders turned off, but to be certain, someone with a colorimeter will need to check.

Unfortunately, the “0.000001”/6-decimal version of the clamp does not round down to zero black, at least on my C1.

scanline_colour = clamp(scanline_colour, vec3(0.0000001f), vec3(1.0f));

rounds down to zero black, at the cost of allowing some overshoot back in. I used this modified version of the clamp for the color comparisons, but to my eyes, colors are identical with the 0.0000001/7-decimal and 0.000001/6-decimal versions.

crt-sony-megatron.slang modifications for Gamut Overshoot Fix parameter
(...)
   // User Settings
(...)
   float hcrt_scanline_colour_clamp;
(...)
#include "include/parameters.h"
(...)
#define HCRT_SCANLINE_COLOUR_CLAMP           params.hcrt_scanline_colour_clamp
(...)
      scanline_colour += scanline_channel_2 * kColourMask[channel_2];
   }

   if(HCRT_SCANLINE_COLOUR_CLAMP >= 1.0f)
   {
      if(HCRT_SCANLINE_COLOUR_CLAMP == 1.0f)
      {
//       scanline_colour = clamp(scanline_colour, vec3(0.000000132362252f), vec3(1.0f));
         scanline_colour = clamp(scanline_colour, vec3(0.0000001f), vec3(1.0f));
      }
      else if(HCRT_SCANLINE_COLOUR_CLAMP == 2.0f)
      {
         scanline_colour = clamp(scanline_colour, vec3(0.000001f), vec3(1.0f));
      }
   }
   
   vec3 transformed_colour;

   if(HCRT_COLOUR_ACCURATE >= 1.0f)
(...)

and

#pragma parameter hcrt_scanline_colour_clamp " Gamut Overshoot Fix" 1.0 0.0 2.0 1.0

somewhere in parameters.h.

Gamut Overshoot Fix: 0 = no fix, 1 = the 7-decimal clamp, 2 = the 6-decimal clamp.

2 Likes

Greetings @MajorPainTheCactus. I’ve recently begun to notice burn-in, possibly due to the scanlines and scanline gaps wearing unevenly over the 4:3 area of the viewport.

I was discussing a way to reverse it with @HunterK, and we came up with a small shader that shifts the content up and down over time so that the unused pixels that make up the scanline gaps get to do some work and the ones that display the scanlines get some rest.
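Something along these lines, roughly speaking (a minimal sketch of the concept only, not the actual shader we discussed; the 600-frame interval is an arbitrary illustration): appended after the CRT passes and run at viewport scale, it offsets the finished image by one output pixel vertically every so often, so the rows that normally sit in the scanline gaps take a turn being lit.

#version 450
// Illustrative anti-burn-in pass: periodically offset the final image by one
// output pixel vertically. Intended to run as the last pass at viewport scale.

layout(push_constant) uniform Push
{
   vec4 SourceSize;
   vec4 OutputSize;
   uint FrameCount;
} params;

layout(std140, set = 0, binding = 0) uniform UBO
{
   mat4 MVP;
} global;

#pragma stage vertex
layout(location = 0) in vec4 Position;
layout(location = 1) in vec2 TexCoord;
layout(location = 0) out vec2 vTexCoord;

void main()
{
   gl_Position = global.MVP * Position;
   vTexCoord = TexCoord;
}

#pragma stage fragment
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D Source;

void main()
{
   // Toggle between a 0- and 1-pixel vertical offset every 600 frames (~10 s at 60 Hz).
   float shift = mod(floor(float(params.FrameCount) / 600.0), 2.0);
   vec2 coord = vTexCoord + vec2(0.0, shift * params.OutputSize.w); // OutputSize.w = 1/height
   FragColor = texture(Source, coord);
}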

He said that it needed the scaling on the last pass of whatever preceding shader to be set to “Viewport”, and the scaling on the shifting shader itself to be set to “Viewport” as well.

Guess what happens when I set those to “Viewport” when using Sony Megatron Color Video Monitor?

The thing is, if I use another shader and RetroArch’s built-in settings to enable HDR, there’s no issue with the last pass scaling being set to “Viewport”.

Here’s a link to the conversation over at Reddit.

https://www.reddit.com/r/RetroArch/s/WMtEEw2rxf

1 Like

Why are you concerned? Why are you complaining? You’re basically morphing your OLED into a CRT!

I mean, let the shader run overnight and you’ll have scanlines for free!

Joking ofc, I hope it will recover.

5 Likes

You should really be letting the display run its short compensation cycles at least a couple of times throughout the day.

Like, you don’t need to obsessively set a timer to make absolutely certain that it runs a cycle at every possible opportunity, but having the display run practically 24/7 isn’t doing it any favors.

What are you referring to exactly: manually invoking the pixel cleaner/refresh/clear panel noise, or just having the display automatically run its compensation cycles?

The timer was actually @HunterK’s idea and it’s probably intended to help prevent this specific type of burn-in situation from occurring in the first place.

My TV has pixel shift, but that won’t necessarily be activated for this type of content, because although the scanline gaps remain static, the scanlines themselves are active and dynamic.

That’s the thing, eh: if you’re interested in something, into something and passionate about something, then you might find yourself spending more time on and doing more of that thing. There were times when I might have left games on in attract mode overnight or left a game on sound test for an extended period. The sound test scenario would probably result in the screen dimming, while the attract mode probably wouldn’t.

Even without leaving it on overnight, I found my usage mix shifted much more in favour of retro games as opposed to other content. Then remember, as a person who designs presets, every viewing opportunity is a chance to either admire, appreciate or improve one’s settings.

So, it is what it is, and it has happened already, so this is about finding a solution and also communicating the potential for this type of thing to occur to the rest of the community, so that users can be a bit more vigilant.

I don’t think anyone else has reported burn-in due to scanlines and 4:3 Retro-Gaming content on their OLED TVs, but beware that it can happen. Over time I know I’ve read about other users’ concerns about the potential for this. With HDR it’s probably even easier for this to occur.

If I recall correctly, @hunterk suggested that the shader be integrated into the Sony Megatron Color Video Monitor shader as a workaround for the “Viewport Scaling” bug.

The manual compensation cycle/pixel cleaner/refresh/clear panel noise options run a long compensation cycle. I am not certain of the exact timing for all other models, but on the C1, this takes the better part of an hour, and the display actively prompts you to run one once every 2000 hours.

Short compensation cycles on the C1 take almost exactly 6 minutes, and run automatically after 4 hours of cumulative usage, starting from when you turn off the display. This is why people say you should never turn off an OLED at the outlet/wall/power strip/surge protector, as that interrupts or prevents these short compensation cycles from running properly, and they are essential for maintaining the panel.

(Have you noticed that, normally, when you shut off the display, you hear a click, but sometimes, the click is delayed? That means the short compensation cycle was running.)

This is fine.

This is foolish.

TLDR: if no one is actively using it, shut it off, and maybe consider taking a brief break at least a couple of times a day to let that short cycle run. If you actually want an attract-mode nightlight, get an old beater CRT.

This is not necessarily or always a deliberate thing. Sometimes one might fall asleep or maybe the game might just be relaxing. I wouldn’t call it being foolish.

The point is our OLED TVs may not be as immune to burn-in as we might have thought when using these shaders so a word to the wise is sufficient, especially if this is what we use them for primarily.

Also remember that this is an E6P we’re talking about, with hundreds if not thousands of hours running CRT shaders, so after a very extended period of doing things like this and suffering no permanent or long-lasting effects, one could be forgiven for thinking that this usage scenario might not be such a cause for concern.

What can then happen, and probably did, is that the envelope gets pushed a bit further as to what can be considered “safe” for the TV.

The most recent variable was the use of HDR shaders, which, in the back of my mind, I always wondered might be more likely to contribute to burn-in than those “dark” presets I became infamous for.

I’m not losing any sleep over this so I don’t think you should get worked up about it either.

Think of it as more of a reminder and PSA to other OLED Display users out there to be careful.

Ah. It’s probably worth noting that the E6P isn’t a remotely modern OLED by burn-in mitigation standards. The gulf between 2010-2017 models and 2019+ models is absolutely massive in this regard. (I’ve seen mixed reports regarding which side of the line the 2018 C8 is on.)

And 2021+ models received further substantial mitigation improvements, especially those with WBE “EVO” panels.

Which isn’t to say that modern OLEDs are indestructible, just that anything older than a C9 is no more representative of what you can expect for burn-in mitigation from a modern OLED than a launch PlayStation Vita or a 2010 LG 15EL9500.

John Linneman of Digital Foundry has said that he thinks modern OLEDs have a burn-in risk comparable to mid-to-late-90s CRT monitors, and I think all available evidence does point in that direction.

1 Like

I’m aware of this and this was exactly my point. Even such an old OLED has been performing so well for so long in these scenarios that one can be forgiven for feeling relatively safe when it comes to burn-in potential due to my particular “lifestyle” and usage scenario factors.

However, the use of HDR introduced a new set of variables; this is something I’ve wondered about, and now I have the results to validate those thoughts and concerns.

The other thing is that clear panel noise/pixel refresh/panel clean etc. can only go so far before running out of voltage headroom with which to normalize pixel brightness uniformity.

It’s also possible that due to the specific pattern of uneven wear, alternating scanline patterns may present a more difficult scenario for the above algorithms and technology to address.

I understand and agree.

This is good to know but this evidence may not necessarily be as applicable to our particular niche as other usage scenarios.

https://youtube.com/shorts/RJSE9zACCvI?si=J--28nZ20SWaUKft

Mark (Blur Busters) said on Twitter that he used Visual Studio on his OLED monitor daily, with the taskbar on, for like 2 years straight, and he has no burn-in whatsoever, but I think he was running SDR at 120 nits. So for SDR 100-120 nit usage, it seems like OLED burn-in is practically solved.

Running HDR/high nits, however, is different; plus, the Mega/Cybertron shaders turn off a lot of pixels, which means uneven aging.

1 Like

Greetings @MajorPainTheCactus. I’ve noticed that on RTINGS.com there are several different measurements for HDR Peak Brightness, and these also vary depending on the age of the article.

For example, some reviews provide an HDR Real Scene Peak Brightness, while others don’t and instead just provide the different values for the different scene types/tests.

Then below those they would list the various % Window tests, for example 2% Window, 10% Window, 25% Window, 50% Window and 100% Window.

All of this adds some ambiguity and confusion, not only for the man on the street but even for advanced users who might be just getting their feet wet with respect to these things.

Then, to add to the confusion, some TVs test completely differently depending on which Picture Mode they’re in. I use the HDR Game mode exclusively for gaming, though.

Can you, or someone else who is absolutely certain, state which is the best or most appropriate value one should use when encountering this hierarchy of choices?

As an example, this TV can do 915 cd/m² in a 10% Window, but its HDR Real Scene Peak Brightness is only 437 cd/m². Which is the correct value I should use in Sony Megatron Color Video Monitor for Peak Luminance Value?

In this scenario they used the ‘Cinema HDR’ Picture Mode; sometimes the ‘HDR Game’ mode might give different results. Are we supposed to use the peak values from whatever mode they tested, even if we’re going to be using Game Mode?

How about this one?

Should I choose 624, 652 or 651?

1 Like

You should contact Rtings themselves, or post in their forums. In any case, those numbers make no sense and should be reversed: the 2% window is (almost) always brighter than the 100% window, especially on OLED.

Compare those numbers to the recent LG C4 OLED review numbers.

Rtings have gotten a lot better over the years. Their old reviews are very amateurish by comparison.

1 Like