Sony Megatron Colour Video Monitor

I’m aware of this, and that was exactly my point. Even such an old OLED has performed so well for so long in these scenarios that one can be forgiven for feeling relatively safe about burn-in potential, given my particular “lifestyle” and usage-scenario factors.

However, the use of HDR introduced a new set of variables. This is something I’ve wondered about, and now I have the results to validate those thoughts and concerns.

The other thing is that clear panel noise/pixel refresh/panel clean etc. can only go so far before running out of the voltage headroom needed to normalize pixel brightness uniformity.

It’s also possible that due to the specific pattern of uneven wear, alternating scanline patterns may present a more difficult scenario for the above algorithms and technology to address.
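To illustrate the headroom point with a toy model (entirely my own numbers, nothing to do with how any vendor’s compensation actually works): once a subpixel has degraded beyond whatever extra drive the panel keeps in reserve, no refresh cycle can pull it back to uniform brightness.

```python
# Toy model (my own illustration, not how any real panel firmware works):
# a compensation pass tries to normalize aged subpixels by driving them
# harder, but only has a fixed amount of extra drive headroom to spend.

HEADROOM = 1.25  # assume at most +25% extra drive is available

def compensated_output(efficiency: float) -> float:
    """Relative brightness after compensation, for a subpixel whose
    emitter has degraded to `efficiency` (1.0 = brand new)."""
    boost = min(1.0 / efficiency, HEADROOM)  # clamp at the headroom limit
    return efficiency * boost                # 1.0 means fully normalized

for eff in (1.00, 0.90, 0.80, 0.70):
    print(f"efficiency {eff:.2f} -> {compensated_output(eff):.2f} of original brightness")
```

With these made-up numbers, anything worn past about 20% can no longer be evened out, which is the “running out of headroom” scenario.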

I understand and agree.

This is good to know, but that evidence may not be as applicable to our particular niche as it is to other usage scenarios.

https://youtube.com/shorts/RJSE9zACCvI?si=J--28nZ20SWaUKft

Mark (Blur Busters) said on Twitter that he Visual Studio’d on his OLED monitor daily with the taskbar on for about two years straight and has no burn-in whatsoever, but I think he was running SDR at 120 nits. So for SDR 100-120 nit usage it seems like OLED burn-in is practically solved.

Running HDR/high nits, however, is different, plus the Mega/Cybertron shaders turn off a lot of pixels, which means uneven aging.
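For a rough sense of scale, here’s a back-of-the-envelope sketch (the duty cycles are assumptions I picked for illustration, not the Megatron shader’s actual patterns) of how much of a 4K panel a scanline/mask preset might leave dark:

```python
# Back-of-the-envelope estimate of how unevenly a CRT scanline/mask preset
# lights a 4K panel. The specific duty cycles below are assumptions chosen
# for illustration, not the Megatron shader's actual patterns.

PANEL_H = 2160      # panel rows on a 4K display
CRT_LINES = 540     # emulated CRT height -> 4 panel rows per scanline

rows_per_line = PANEL_H // CRT_LINES            # 4 panel rows per emulated line
lit_rows_per_line = 2                           # assume ~2 of 4 rows carry the beam
row_duty = lit_rows_per_line / rows_per_line    # fraction of rows that are lit

mask_duty = 2 / 3   # assume an aperture-grille style mask lights ~2 of 3 subpixels

lit_fraction = row_duty * mask_duty
print(f"roughly {lit_fraction:.0%} of subpixels do all the work; "
      f"the other {1 - lit_fraction:.0%} sit dark and age differently")
```

Whatever the exact masks, the lit subpixels have to be driven much harder to hit the same overall brightness, which is where the uneven-aging worry comes from.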

1 Like

Greetings @MajorPainTheCactus. I’ve noticed that on RTINGS.com there are several different measurements for HDR Peak Brightness and this also varies depending on the age of the article.

For example, some reviews give an HDR Real Scene Peak Brightness, while others don’t provide one at all and instead just list the values for the different scene types/tests.

Then below those they would list the various % Window tests, for example 2% Window, 10% Window, 25% Window, 50% Window and 100% Window.

All of this adds some ambiguity and confusion, not only for the man on the street but even for advanced users who might just be getting their feet wet with respect to these things.

Then, to add to the confusion, some TVs test completely differently depending on which Picture Mode they’re in. I use HDR Game mode exclusively for gaming, though.

Can you, or someone else who is absolutely certain, state which is the best or most appropriate value to use when encountering this hierarchy of choices?

As an example, this TV can do 915 cd/m² in a 10% Window, but its HDR Real Scene Peak Brightness is only 437 cd/m². Which is the correct value I should use in Sony Megatron Color Video Monitor for Peak Luminance Value?

In this scenario they used the ‘Cinema HDR’ Picture Mode; sometimes the ‘HDR Game’ mode can give different results. Are we supposed to use the peak values from whatever mode they tested, even if we’re going to be using Game Mode?

How about this one?

Should I choose 624, 652 or 651?

1 Like

You should contact Rtings themselves, or post in their forums. In any case, those numbers make no sense and should be reversed: a 2% window is (almost) always brighter than a 100% window, especially on OLED.

Compare those numbers to the recent LG C4 OLED review numbers.

Rtings have gotten a lot better over the years. Their old reviews are very amateurish by comparison.

1 Like

My apologies for the confusion. The first set of results was from an LG IPS LED TV, while the second set was from an LG OLED TV.

I can’t seem to get HDR to trigger with OpenGL or Vulkan stuff. It always says “turn on HDR if your monitor supports it” even though it’s always on. I thought I’d seen people in the thread mention the NVIDIA setting where you set the presentation method to “prefer layered on DXGI swapchain” to get around this, but that doesn’t seem to make any difference, except it did stop the screen from staying dim after quitting games until I switched HDR off and on again. Am I missing something, or does AutoHDR just not work with these? I’ve got Megatron running on nearly everything I’d want it on except for this.

1 Like

This is for the WindowCast Core, right?

Can you explain from step 1, what exactly you’re trying to do please?

If using a D3Dxx driver, you have to enable HDR both in Windows and RetroArch.

If using a Vulkan driver in RetroArch you don’t need to enable HDR in Windows first.

I don’t think RetroArch supports HDR when using an OpenGL video driver.

Except for what?

If using AutoHDR, then HDR + AutoHDR should be enabled in Windows, but HDR should be disabled in RetroArch and also in the Sony Megatron Color Video Monitor Shader Parameters for whatever preset you’re using.
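To summarize those cases, here’s a small sketch that just encodes the checklist above as a lookup; the function name, keys and driver strings are mine, purely illustrative, and nothing here talks to RetroArch or Windows:

```python
def hdr_checklist(video_driver: str, using_autohdr: bool = False) -> dict:
    """Encode the advice above as a lookup. Keys and strings are
    illustrative only; nothing here touches RetroArch or Windows."""
    if using_autohdr:
        # AutoHDR path: Windows does the tone mapping, so keep RetroArch
        # and the Megatron preset's HDR parameter off.
        return {"windows_hdr": True, "windows_autohdr": True,
                "retroarch_hdr": False, "megatron_hdr_param": False}
    if video_driver.startswith("d3d"):
        # D3D11/D3D12: HDR must be enabled in both Windows and RetroArch.
        return {"windows_hdr": True, "retroarch_hdr": True}
    if video_driver == "vulkan":
        # Vulkan: RetroArch can engage HDR without Windows HDR being on first.
        return {"windows_hdr": "optional", "retroarch_hdr": True}
    # OpenGL: as far as I know, RetroArch has no HDR support on this driver.
    return {"retroarch_hdr": False, "note": "no HDR with the GL driver"}

print(hdr_checklist("d3d12"))
print(hdr_checklist("vulkan"))
print(hdr_checklist("d3d11", using_autohdr=True))
```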

1 Like

You still want to enable ‘Prefer layered on DXGI swapchain’ when using Vulkan in RetroArch, because only then can the program run in borderless fullscreen while still using CRT shaders. It won’t mess with any HDR settings, so give it a try anyway; it’s pretty much a mandatory NVIDIA setting for RetroArch with the Vulkan driver.

3 Likes

Sorry, I meant the ReShade version. Games or emulators that use OpenGL or Vulkan aren’t recognized by the AutoHDR addon (or by Windows’ built-in AutoHDR either). RetroArch is already working fine for me.

Also, it took me a while before I realized I had to set the CRT height to 540 on a 4K monitor, not 480, to get no artifacts (almost everything I’m using it on runs at 640x480, though I’ve started trying out some 800x600 games with the CRT height set to 720 too). I’m assuming that’s because the ReShade version works with the whole screen rather than just the game’s window, and 2160 divided by 4 is 540, with 4x being the max you can integer-scale 640x480 within a 4K frame. Just for reference for anyone else, or in case I’m getting it all wrong.
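In case it helps anyone else sanity-check that, a quick bit of arithmetic (assuming, as above, that the ReShade version scales against the full panel height):

```python
# Quick check of which emulated CRT heights divide a 2160-row panel evenly,
# assuming the ReShade version maps scanlines against the full screen height.

PANEL_H = 2160

for crt_height in (480, 540, 600, 720, 1080):
    scale = PANEL_H / crt_height
    verdict = "clean integer scale" if scale.is_integer() else "fractional -> expect artifacts"
    print(f"CRT height {crt_height:4d}: {scale:4.2f} panel rows per line ({verdict})")
```

480 and 600 come out fractional on a 2160-row screen, while 540 and 720 divide evenly, which matches what I’m seeing.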

1 Like