Sony Megatron Colour Video Monitor

Yes, the raised black level is only an issue when there’s ambient light. The brightness also isn’t as much of an improvement over OLED as I would have hoped. I’d probably still choose QD-OLED over OLED, though.

2 Likes

The main reason I got a QD-OLED was that the wife was complaining about our 10-year-old 1080p Sony’s remote not working, so I decided to fix the dint with a mallet. :rofl:

The thing I’d say about WRGB is that the white element is obviously used as a flood light to brighten up the highlights, and I’m not sure how that really works in our case: when/how does the TV decide to switch it on in our rather odd use case (at least with 100% masks)? I’d also normally expect to see unexpected shifts in gradients as it gets turned on. But obviously they are great TVs, so it’s just a case of understanding these things.

3 Likes

One slightly reassuring thing is that RTINGS, who have been running the burn-in stress test, suspect the worse retention on the QD-OLED screens is caused primarily by static white content, so it shouldn’t be such a big issue when it comes to emulation at least?

3 Likes

Yes, at least with the Megatron, as that only has 100% masks, but probably any shader with some amount of mask. It must be a heat-dissipation issue with closely packed elements that are all fully emitting, which doesn’t really happen with the aforementioned masks and scanlines. I’d be very interested in how ABL is affected by these shaders.

3 Likes

Another way of looking at this is that since white is not generated by using actual white in these shaders, the white subpixels become underutilized, negating any burn-in advantage of WRGB/W-OLED over QD-OLED in this regard. So perhaps the burn-in risk for this particular use case might be just as high/bad as QD-OLED?

3 Likes

Has ABL been an issue on the S95B when using the Megatron? All I can say is the photos you posted a while back look fantastic. Have you encountered any issues aside from the mask/color problem (which would affect any recent TV)? Do highlights pop the way they should against a dark background, etc? (this was an issue on the QLEDs as I recall)

2 Likes

I see here that you’ve done some interesting things to the settings, while the SDR presets have very different values, resulting in a sub-optimal image. Any plans to update the SDR presets? I think the Megatron would reach a much wider audience that way (although it’s really meant to be experienced in HDR)

Also I think the shader has been updated quite a bit since these photos were taken, so the same settings shown here won’t necessarily produce the same results.

1 Like

OK, so is that when the white subpixels get turned on: when all three RGB sub-elements are on? That would line up with what I’ve seen in your various photos, I suppose (as in, I never see the white sub-element on). Doesn’t that mean that anything using near-100% masks will be really quite dark? I suppose what we need is a colorimeter to test what the luminance actually is, as I suspect we’d get a lot of unexpected values out of various displays with these shaders.

2 Likes

No, ABL hasn’t been an issue as far as I know, but I’d be really quite interested in what unhinging the brightness would do (if you could do it) with the Megatron. As for QD-OLED itself, it’s pretty good: perfect blacks and little light bleed. It’s just the weird triad layout it uses, but that doesn’t really affect these shaders, at least for slot mask and aperture grille - I don’t think I’ve really tried dot masks.

1 Like

Yes, I updated the settings to hopefully be more intuitive. I do need to find time to go back and fix up all the presets; I just don’t have the time atm - one day though!

2 Likes

This never occurs; all 4 subpixels are never on at the same time. So if the RGB subpixels are on, white is definitely going to be off. In my most recent photo, which included elements of the RetroArch menu, you could see this clearly: the white menu text and lines appeared pure white, which suggests that the white subpixel was being used to display those pure white elements.

This also lines up with what LG has been saying about using the white subpixel to make the TV brighter, since peak brightness is measured using pure white, if I’m not mistaken. If a TV were to generate pure white from a mix of RGB, that would use roughly three times the power and would definitely not be as efficient as using a pure white subpixel.

If I recall correctly, the OLEDs in a W-OLED panel are all natively white before they get filtered into their RGB elements. It therefore makes sense to have an extra unfiltered subpixel, which can run at peak brightness because it loses no light to filtering, especially when the alternative is mixing three filtered primaries just to get back the white you had from the get-go.
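Just to illustrate the idea, here’s a minimal sketch of the textbook RGB-to-RGBW split, where the neutral part shared by all three channels is moved onto the white subpixel. To be clear, this is only an illustration of the general principle; LG’s actual panel processing is proprietary and almost certainly more sophisticated than this.

```python
def rgb_to_rgbw(r, g, b):
    """Textbook RGB -> RGBW split: move the neutral component shared by all
    three channels onto the white subpixel and keep only the coloured
    remainder. Purely illustrative - not LG's actual algorithm."""
    w = min(r, g, b)                    # the neutral part of the colour
    return r - w, g - w, b - w, w

# A neutral white pixel routes entirely to the white subpixel...
print(rgb_to_rgbw(1.0, 1.0, 1.0))   # -> (0.0, 0.0, 0.0, 1.0)
# ...while a saturated colour leaves the white subpixel off.
print(rgb_to_rgbw(1.0, 0.2, 0.0))   # -> (1.0, 0.2, 0.0, 0.0)
```

Incidentally, with a split like this at least one of R, G and B always ends up at zero, so no more than three of the four subpixels would ever be lit at once, which would at least be consistent with what I described above.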

Over time the information as well as my understanding of the structure of W-OLED panels has evolved somewhat. Initially, I remember reading that all of the subpixels were natively white but required a colour filtering layer in order to achieve RGB.

The reasoning at the time (and we can look at LG’s W-OLED patents to verify this) was that one of the main challenges in creating large OLED panels to be used as TVs was that different coloured (native) OLEDs aged at different rates.

This was acceptable for use in cell phones which people tend to hold onto for a much shorter time than the expected 10 year+ traditional replacement cycle/life expectancy of a TV.

LG’s solution to the uneven wearing was to use ALL white OLEDs and just filter them.

Due to their patent, they held onto a monopoly on this market for as long as they did.

In order to sidestep this patent Samsung came up with something slightly different. Use ALL blue OLEDs as the light source then use Quantum Dots to filter them into the various RGB elements.

Quantum dots have their own benefits, the most notable of which is the brightness at which they glow, which also contributes to the high colour volume they can produce.

3 Likes

No rush! I mostly asked because I’ll probably have to wait another year before upgrading displays, lol.

Have you played around much with BFI? Are you able to maintain CRT-like brightness while using it?

1 Like

So if the white subpixel is never turned on, and it is primarily responsible for adding HDR-level brightness, does that mean W-OLED, with this shader’s current configuration, will be further hampered compared to QD-OLED when it comes to brightness?

2 Likes

This might have to be measured and tested. It may not necessarily be hampered, as all that is really needed is for the display to be bright enough, and I think both W-OLED and QD-OLED displays have proven they can be.

Even some SDR displays might be bright enough.

Also, despite QD-OLED’s theoretical advantages, there might be some quirks that could hamper that technology versus W-OLED in practice. For example, QD-OLED’s ABL might be too aggressive or not optimally tuned, and it can’t be disabled, whereas W-OLED’s ABL and other brightness-limiting, burn-in-mitigating technologies have so far been possible to turn off (at least on LG TVs).

Based on my observations in this thread, we have users who own both types of display who have been able to achieve a satisfactory experience.

What more could one ask for when you’re already using possibly the most accurate CRT Shader on the planet?

3 Likes

Yeah, I would guess that when it comes to reaching proper brightness it must still be getting activated on W-OLEDs; might be interesting to test.

1 Like

Proper brightness or peak brightness? Is it that you’re thinking the other subpixels are incapable of producing “proper” brightness, or at least of getting very bright? Remember, LEDs of all types and colours have been around for a long time, and they’re all capable of getting pretty bright. If W-OLED panels use all white OLEDs as a base, with some of them being filtered to produce the primary colours, don’t you think all of those OLED subpixels will be at least almost as bright as one another, including the white subpixel?

The white subpixel being used for additional brightness, efficiency and purity of white doesn’t mean that when it’s not in use everything else is simply dark.

2 Likes

I just meant that it is likely activating to achieve up-to-spec brightness with regard to HDR content, so up to 800 nits peak brightness on the LG C2, I believe? I think otherwise it would peak at ~350 nits, as with SDR, so there’d be nothing between that and 800. In retrospect, I think that when I brought this up it was a non-issue, but I’m running on little sleep today.

When it comes to burn-in, I wonder how much the constant demand for HDR brightness for the emulated phosphors will contribute towards it on either panel, though at least the content is changing colour consistently. I’m also a little surprised that the heatsink on the Sony QD-OLED doesn’t appear to be helping much; it has actually been performing worse in these tests from what I have seen. I’ve read theories about how the pixel-refresh cycles differing between various sets could be a factor, but I have no real idea how valid that might be.

2 Likes

I assume it turns them on for various light greys and off-whites, does it not? Or is it only for pure white? There are lots of highlights that aren’t pure white. If so, at what point does it switch off the RGB subpixels and turn on the white subpixel?

Obviously, with our shaders, W-OLED is going to be quite a lot darker than what the RTINGS peak-brightness figures state, as those are all based on pure white.
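To put a very rough number on “quite a lot darker”, here’s a back-of-the-envelope sketch. The 800 and 350 nit figures are just the values floated earlier in this thread, not measurements, and I’m ignoring ABL, tone mapping and scanline darkening entirely.

```python
# Back-of-the-envelope estimate of full-field brightness through a 100% RGB
# mask on a W-OLED. All figures are assumptions from the discussion above.

peak_white_nits = 800.0      # assumed pure-white HDR peak (white subpixel helping)
rgb_only_white_nits = 350.0  # assumed white level from the coloured subpixels alone

# With a 100% aperture-grille-style mask, each output pixel lights roughly one
# of the three coloured subpixels, so a full "white" field lands at around a
# third of the RGB-only white level.
masked_white_nits = rgb_only_white_nits / 3.0

print(f"Quoted pure-white peak:              {peak_white_nits:.0f} nits")
print(f"Estimated masked full-field 'white': {masked_white_nits:.0f} nits")
# -> about 117 nits, a long way below the figure measured on a pure white window.
```

A colorimeter reading would obviously be far more useful than this, but it at least shows why the quoted peak-brightness numbers don’t tell us much for this use case.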

2 Likes

I’m really not sure; this is something that someone could perhaps set up and test with a macro camera lens to verify.

I doubt it would only be used for pure white. In the example I gave, it was using it for pure white because the RetroArch menu was active, and the menu doesn’t use RGB phosphor emulation, while in the background a game was being emulated.

Except in the dark areas (which wouldn’t activate the white subpixel anyway), the emulated content keeps the coloured RGB subpixels active, especially in the brightest scenes and when producing grey through white, so the white subpixel probably won’t ever be activated, at least when using Sony Megatron Color Video Monitor. It might be when using another shader with lots of “fake”, err… synthetic glow, bloom, or reduced mask strength, though.
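If the panel really does key the white subpixel off the neutral component of each pixel (as in the earlier min()-based sketch, which is only a guess at the logic), it’s easy to see why masked output would never trigger it:

```python
def white_component(r, g, b):
    # Hypothetical trigger for the white subpixel: the neutral part shared by
    # all three channels (same assumption as the earlier RGB -> RGBW sketch).
    return min(r, g, b)

# Megatron-style 100% mask output: each display pixel carries only ONE primary,
# even in areas the game intends to be white.
masked_pixels = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
print([white_component(*p) for p in masked_pixels])   # -> [0.0, 0.0, 0.0]

# The RetroArch menu text, by contrast, is genuinely neutral at the pixel level.
print(white_component(1.0, 1.0, 1.0))                 # -> 1.0, white subpixel lights up
```

Again, that’s only a plausible model; the TV’s real decision logic isn’t public.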

It’s interesting that after so long this remains somewhat of a mystery.

Just for clarity, it doesn’t switch off ALL the RGB subpixels once the white subpixel is on. The panel is designed to never have more than 3 subpixels on at the same time, so combinations like RGB, RGW, RWB and WGB can be lit together, but never all four (WRGB).

Any RTINGS photos showing all 4 subpixels active are composited from multiple photos superimposed on one another.

From the cropped screenshots below, it appears as though the white subpixel is being used for the font and the white line as well as the grey line in the RetroArch menu. I don’t see any of the coloured subpixels active here.

The pairs of RGB dot columns, meanwhile, are from the white box around the pilot in U.N. Squadron on the SNES. I don’t see any white subpixels active there.

3 Likes

So, if you take the shader out of the equation, do those same areas have the white subpixel lit? I would think step 1 would be to find some way to turn it on, then start seeing how to turn it off on command.

3 Likes