Rec. 709 doesn’t have the issue as the primaries are the same colour, and the same is true of sRGB. However, I think a P3 monitor or any expanded-gamut display would do the same. Unless they fake it, i.e. the reverse of what I said: turning on different subpixels from the Rec. 709 primaries to get P3 primaries and adding extra brightness.
It would be interesting if an LG WOLED was actually a little better at creating standard RGB phosphors than a Samsung QD OLED, even with the white subpixel
I don’t understand. When you say standard, are you talking about RGB instead of BGR, or are you talking about Aperture Grille instead of Slot Mask, which is what you’re seeing in the example?
BGR Aperture Grille works just as well as BGR Slot Mask.
Based on how those TVs work, there are never 4 subpixels on at the same time, so if you’re using an application that has the red, blue and green subpixels active, then there shouldn’t be a need for the white subpixel to be turned on and one of the other colours turned off.
Even if we’re using a mask pattern like RRBBGGX, which uses 2 pixels for each phosphor colour, and we want to produce white, then: the red subpixel in the first and second pixels would be active with the blue, green and white off; the blue would be active and the others off for the 3rd and 4th pixels; the green would be active and the others off for the 5th and 6th pixels; and for the 7th pixel everything would be off.
This is all that is needed to produce white using a CRT shader mask. Why would there be any need for the white subpixel to come into the picture when all you’re trying to produce is red, blue, green and black?
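Just to make that concrete, here’s a rough sketch of the idea in Python (the 7-pixel RRBBGGX string and the drive values are purely illustrative, not what any shader actually does internally):

```python
# Rough illustration of how a pure-mask CRT shader lights an RRBBGGX phosphor
# pattern when the source colour is white. Each display pixel only ever needs
# ONE of its R/G/B subpixels on; the white subpixel is never asked for.

MASK = "RRBBGGX"  # 7 display pixels per simulated phosphor triad

def subpixels_for_white(mask=MASK):
    """Return the (R, G, B) drive levels per display pixel for a white input."""
    levels = []
    for slot in mask:
        r = 1.0 if slot == "R" else 0.0
        g = 1.0 if slot == "G" else 0.0
        b = 1.0 if slot == "B" else 0.0
        levels.append((r, g, b))  # "X" stays (0, 0, 0): the gap pixel is fully off
    return levels

for i, (r, g, b) in enumerate(subpixels_for_white(), start=1):
    print(f"pixel {i}: R={r} G={g} B={b}")
```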
As we can see from the images I shared, when the TV wants to produce white in the traditional way it can definitely use that white subpixel to get a pure, bright white.
Fortunately for us, CRT shader masks (i.e. pure mask simulations) don’t use white to produce white.
I’m just speaking from my personal observations and experiences by the way, not trying to provide any expert scientific advice.
Take a look at this image here to see what’s really going on with these WRGB subpixels.
I’m seeing blue, green and red adjacent to one another, followed by the white subpixel, which can be anywhere from white to grey to complete black. This represents an entire pixel. It might look like it’s wide and spread across a large area, but do we really appreciate how tiny this one pixel is and how microscopic the distance is from the leftmost subpixel to the rightmost?
Also take into consideration that as each subpixel is activated it will emit a glow. I have a feeling that unless you have some specialised equipment it would be near impossible to tell if the glow is coming from the left, right or center of the pixel.
That image was taken from this article below. It’s the only one I’ve found where they’ve composited it in a way that shows all of the OLED subpixels active at the same time.
This is another lovely photo showing WRGB OLED TV pixel structure taken from a website that you can subscribe to for the latest news and information regarding OLED and other emerging display technologies.
This is from a much older early Full-HD TV so the layout seems a bit different from what we have today.
https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo
That’s a much better photo, I’m still a bit disappointed in the black gap between the red and blue, but overall it’s not as bad as the first photos had led me to believe. The only other problem I’m seeing is that the subpixels are tuned to different RGB values, which @MajorPainTheCactus already mentioned.
Well, if you’re referring to the gap between the red and blue columns of subpixels in the CRT mask emulation: any discrepancy between the size of that gap and the size of the gap between the blue and green columns of subpixels is very minute, even at such high levels of magnification. At actual screen size and “normal” viewing distances for a screen of that size, the disparity is even less apparent. In the more focused photos it’s harder to tell that there’s a difference in the size of the gap.
If you were referring to the OLED subpixel layout photo from RTINGS.com, then I look at it as just a BGR layout instead of an RGB layout, with the white subpixel taking the place of the “X” or the gap that’s common between pixels even in LCD flat panel pixel arrays. Do remember that the large black spaces in the RTINGS photo are turned-off subpixels.
Well, at least we’ve learned what might have been contributing to those strange stray subpixels being turned on in some of the pictures we’ve seen in the past. Maybe some fresh, out-of-the-box thinking on how best to mitigate this is in order for the rest of 2023 and beyond.
It might involve some hacking, or some culling of the colours that trigger the extra subpixels. Colour accuracy might suffer slightly, but you might still be able to produce enough colours using the display’s subpixel primaries that it really doesn’t matter much.
In any case this is also way above my pay grade. When it comes to certain things I sometimes come up with imaginary theories on how things might be able to be overcome.
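To put a concrete (and very naive) reading on that “culling” idea, here’s a sketch: after the mask is applied, force each display pixel down to its single dominant channel so the panel has no mixed colour to remap onto other subpixels. This is purely illustrative — real WRGB panels do their own subpixel rendering that a shader can’t directly control, and the threshold value below is made up:

```python
def cull_to_dominant_channel(r, g, b, leak_threshold=0.05):
    """Zero out weak 'leakage' channels, keeping only the dominant one."""
    channels = [r, g, b]
    dominant = max(range(3), key=lambda i: channels[i])
    # Only cull when the non-dominant channels are small leakage;
    # genuinely mixed colours are left untouched so accuracy doesn't collapse.
    if all(i == dominant or channels[i] <= leak_threshold for i in range(3)):
        culled = [0.0, 0.0, 0.0]
        culled[dominant] = channels[dominant]
        return tuple(culled)
    return (r, g, b)

print(cull_to_dominant_channel(0.9, 0.03, 0.02))  # -> (0.9, 0.0, 0.0), leakage removed
print(cull_to_dominant_channel(0.9, 0.60, 0.10))  # -> unchanged, it's a real mixed colour
```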
The subpixels only matter as long as they’re visible to the eye, which is a function of viewing distance, pixel density and visual acuity (within reason; videophilia/audiophilia “golden” body parts notwithstanding). Beyond that threshold, we just draw the mask in whole pixels and move on. Based on Trogglemonkey’s calculations, we should be able to do that at around 6K res for non-weird CRTs (i.e., very high pitch).
Based on Apple’s “retina” display designations, it looks like a PPI between 200 and 400 at a distance of ~50 cm / 20 in is what we’re looking for. According to this chart, that should be doable with a ~40" screen at 8K or a ~20" screen at 4K. (Obviously, 4K < 6K, so we’re not quite at Trogglemonkey’s threshold there.)
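For anyone who wants to sanity-check those numbers, here’s a quick back-of-the-envelope calc (the 1-arcminute figure is just the usual acuity rule of thumb):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def acuity_limit_ppi(distance_in, arcmin=1.0):
    """PPI at which one pixel subtends `arcmin` arcminutes at the given distance."""
    return 1.0 / (distance_in * math.tan(math.radians(arcmin / 60.0)))

print(round(ppi(7680, 4320, 40)))   # ~220 ppi for a 40" 8K panel
print(round(ppi(3840, 2160, 20)))   # ~220 ppi for a 20" 4K panel
print(round(acuity_limit_ppi(20)))  # ~172 ppi: 1-arcminute limit at 20 in / ~50 cm
```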
There’s still the question of how well-rendered the mask needs to be (e.g., that high-res image of the CRT subpixels; do we need to see the rounded corners?), so there will always be room to turn one’s nose up. And that’s without getting into the motion / refresh rate stuff.
But, if we overlay Cyber’s OLED subpixel shot on top of the high-res CRT phosphor shot, it looks pretty good to me, as far as spacing and layout (note: I only rotated his image to make it level and scaled it to match in size; I didn’t stretch anything to fit):
RGB vs BGR is obviously going to make some pixel-pr0n look off, but as with MajorPainTheCactus’ subpixel primary issue, how much that matters depends on what you’re trying to do (i.e., in his case, full-image color accuracy vs subpixel behavior accuracy).
@MajorPainTheCactus would you like for me to spin this out into another thread (or append it to one of the existing general subpixel-focused threads), since it’s fairly off-topic? Or do you care?
I’m fine with it here, but if you think it’d help others find the information I’m good with that too. Long and short, I don’t mind.
The main issue I see with WRGB is that the white subpixel’s job is to boost brightness, and without it on you’re left with a pretty dim display (compared to other technologies, that is). This limits full mask usage and BFI-type motion clarity.
See, I knew there was a downside. There’s always a downside.
It’s not all doom and gloom, because there have been constant improvements to the technology aimed at boosting brightness by mitigating its main side effect, heat, which increases image retention risk.
One of these brightness-improving technologies is the addition of heatsinks to the panel, so that it runs cooler and can be pushed brighter with less risk of image retention. This was done last year.
This year we have the light-boosting MLA, or Micro Lens Array, technology. This is a passive technology that focuses light that would normally be scattered and never reach the viewer’s eyes.
Notice @MajorPainTheCactus said
So this does not necessarily mean that current WRGB OLED TVs are not bright enough to give you a wonderful mask emulation with Sony Megatron Colour Video Monitor without the assistance of the white subpixels.
You bring up a really interesting point here, about boosting brightness by dissipating heat. My theory is that if a manufacturer knew a 100% mask CRT shader was being used, and that 90% of the subpixels would be off for it, they could push the TV’s brightness much higher than they currently do. I wonder if we can convince/bribe them to detect this. Mind you, it might also come down to the individual subpixels burning out. 🤷
I use RetroArch mainly with my Android phone nowadays and it’s got a 513 ppi screen (although it’s relatively low res at 1440p), and I do find it quite difficult to see the phosphors at times, let alone the subpixels. That’s also a foot away from my face. Although you wouldn’t be able to get a full screen of it, you’d be able to prove out the 6K theory (I think).
I just love the way ideas tend to flow one out of the other. This is an interesting point. Did you know that current LG TV users can access the service menu and disable most if not all of the brightness-limiting and auto-dimming “features” of their TVs?
At first you needed a service remote to do this; now there are a couple of apps that allow you to do it via the networking features of the TV.
They’re called ColorControl and LG TV Companion.
I use them both for different things.
I use one to enable temporal dithering so I theoretically get better colour gradients. I really can’t tell the difference though.
I use the other to send a signal to turn off the TV when the computer is turned off or goes to sleep and to turn it back on when it’s woken up from sleep or the computer is turned on.
This enables the TV to act more like a computer monitor. Without that, the default behaviour is for the TV to turn itself off after a while when the computer goes to sleep or puts the display to sleep, but you would normally have to turn the TV back on manually.
Another thing I use them for is to turn the TV off regardless of the sleep settings of the computer. So the TV can be set to be turned off after a certain period of inactivity via the software itself.
After I repaired my TV, I actually had the idea to retrofit a heatsink plus some active cooling of my own onto the back of the panel. That idea kinda stalled, but it might actually be practical.
Maybe this is part of the reason I don’t have any issues with image retention even though I use my TV so much in 4:3 mode for emulation purposes.
Or maybe some kind of firmware hack?
Well, given that you need about 2,000 nits for full slotmask emulation plus BFI just to end up with a constant 100 nits, I’m slightly less optimistic.
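To spell out the arithmetic behind that figure (the coverage and duty-cycle numbers below are illustrative; the exact fractions depend on the mask pattern and BFI setting):

```python
def effective_nits(peak_nits, mask_coverage, bfi_duty_cycle):
    """Average brightness after masking off subpixels and blanking frames (simple model)."""
    return peak_nits * mask_coverage * bfi_duty_cycle

# Illustrative numbers: ~10% of the panel lit by a full slotmask, 50% BFI duty cycle.
print(effective_nits(2000, 0.10, 0.5))  # -> 100.0 nits sustained
```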
We’ll see. There’s a difference between sustained maximum brightness and peak brightness, and we can only hope that they don’t do something weird with the subpixels.
How do I increase the scanline width? I’m not talking about the parameters for R,G,B, but the scanline width overall.
There are three beams, one for each colour channel, so to change the width of the overall scanline (the part that is lit) you need to change all three parameters, which can be found under the vertical scanline mins and maxes. Mins are used for darks, maxes are used for lights. Does that help?
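To illustrate why all three pairs matter, here’s a simplified model of the idea (this is not the Megatron shader’s actual maths, just the min-to-max interpolation concept):

```python
def beam_width(channel_brightness, width_min, width_max):
    """Interpolate one channel's scanline width from its min (darks) to its max (brights)."""
    return width_min + (width_max - width_min) * channel_brightness

# Widening the overall scanline means raising the min/max pair for R, G and B together.
for name, level in (("red", 0.2), ("green", 0.6), ("blue", 1.0)):
    print(name, round(beam_width(level, width_min=0.5, width_max=1.0), 2))
```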
Hi @MajorPainTheCactus, I wanted to ask you a favor. I’d like to experiment with your shader, but I use a Full HD 1080p monitor - could you recommend and send me a suitable configuration for this resolution? I know it’s not the best, maybe not even enough, but I wanted to give it a try since your work is also found in Mega Bezels. A thousand thanks