Is that from the 240p test suite’s white / RGB screen test?
Yes, it is 0 R, 0 G, 255 B
Bingo, we have our problem! Now what on earth is causing that? First thing I can do is try repeating it on my screen. Maybe you can try playing around with the temperature setting in the shader params and see if that removes the problem? I will say, with it up at 2800 it's in the blue end of the spectrum - I'd expect to see red with negative numbers. Can you also check your TV's white balance/temperature settings and make sure they're neutral, i.e. 6504K, at least to see if it helps?
Just tried it on my display and I do indeed get that pattern (see top photo), though possibly to a lesser degree. Changing the white point (in shader parameters) back to 0, i.e. 6500K, gets rid of the red (see bottom photo). However, a white point of 9300K is right for the 2730QM. I'm wondering whether your QN90A's white point is also not neutral and is over-correcting the white point?
Photos were taken with a white point of 6500 set on my phone's camera.
Hi, so just to be clear: you're saying that when you set the output to 1080p you don't get the "weird bright patterns stretching into the letterboxed parts of the image", but at 4K you do? You say this is particularly bad when high TVL shadow mask presets are used, but then in your last post you kind of say this isn't due to the shader. Sorry, I'm a little confused as to where we're at.
I’m noticing another issue with the OLED closeup, which is that the black gaps between triads are quite pronounced where the brightness is low and that’s not something you’d see on an aperture grille. The “phosphors” aren’t blooming enough at low brightness, probably due to the low brightness of the OLED itself and the fact that all the pixels are self-lit… with any other application this would be a benefit, but for our purposes here we really want the subpixels to bleed/bloom into the black gaps between active subpixels.
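To make the bleed idea concrete, the glow can be thought of as a gaussian spread whose width grows with drive level, so bright subpixels fill the gaps between triads while dim ones leave them black. A 1-D toy model (the `base_sigma`/`gain` numbers are made up for illustration - this is not the shader's actual code):

```python
import math

def bloom_1d(intensity, base_sigma=0.4, gain=1.2):
    """intensity: list of subpixel drive levels in [0, 1].
    Each lit sample glows with a gaussian whose width grows
    with its level; brighter -> wider glow."""
    out = [0.0] * len(intensity)
    for i, v in enumerate(intensity):
        if v == 0.0:
            continue
        sigma = base_sigma + gain * v  # brightness-dependent spread
        for j in range(len(intensity)):
            out[j] += v * math.exp(-0.5 * ((j - i) / sigma) ** 2)
    return [min(x, 1.0) for x in out]
```

At full drive the gap between two lit subpixels fills in almost completely; at 20% drive it stays close to black, which is roughly what the closeup photos show.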
I zeroed out my white point and it's pure blue. If I go into positive numbers it adds a little green; negative adds red (a lot of it). My QN90A is set towards a 6500K white point. Very interesting to see.
Although you could well be right, I’d hold fire on whether OLEDs bloom as much until we use the same camera in a side by side comparison.
Once we resolve @BendBombBoom's QN90A issues, BBB is possibly best placed to tell us - at the very least by just eyeballing it.
Correct, at 1080p I don't have issues, since at that resolution the patterns can't get as fine as they can at 4K - the TV upscales to 4K afterwards. It also happens with high TVL aperture grille and slot masks, but the shadow mask is the worst. Not a HUGE problem for me since I prefer lower TVL masks anyway, which don't seem to have these issues, but I still thought I'd point it out in case others come claiming that it's a problem with the shader when it likely isn't.
I was referring to a separate issue, again completely unrelated to the shader, since it happens regardless of shader or even without shaders altogether. Basically the image gains a greenish tint if there's a lot of black around the borders of the image. Say I enter a small room in an RPG and the only thing pictured is the small room centered in the middle of the display, which results in big black borders. In that one instance the colors will shift, and will stay that way until I pull up a menu or leave that room and produce an image that fills the screen, at which point the colors instantly go back to normal. I don't know why this happens - maybe it's some weird processing the TV does - but nothing I do fixes it, so eh.
Ok, so that's great! Does it look 'right' now? One thing to bear in mind with this is that an actual Sony PVM-2730QM has a white point of 9300K, so in order to match that white point with the shader's white point set to 0, you need to set your TV's white point to 9300K.
What I will say is, if that then causes us to explode out in magenta as in the above problem, then there's something wrong with either the gamma or luminance settings in the shader (or possibly even in the TV, but let's not go there yet).
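For what it's worth, the usual way a temperature control like this works is to turn the target CCT into per-channel gains: approximate the Planckian locus to get a chromaticity, then convert through XYZ into linear sRGB. A rough Python sketch - the Kim et al. cubic approximation and the standard XYZ-to-sRGB matrix are assumptions on my part, I don't know that the shader does exactly this:

```python
def cct_to_xy(T):
    """Planckian locus approximation (Kim et al. cubics), valid
    roughly 4000K-25000K -- enough for the 6500K/9300K cases here."""
    x = (-3.0258469e9 / T**3 + 2.1070379e6 / T**2
         + 0.2226347e3 / T + 0.240390)
    y = (3.0817580 * x**3 - 5.8733867 * x**2
         + 3.75112997 * x - 0.37001483)
    return x, y

def white_point_gains(T):
    """Per-channel linear-light gains that push sRGB white toward
    the target temperature (normalised so nothing exceeds 1.0)."""
    x, y = cct_to_xy(T)
    X, Y, Z = x / y, 1.0, (1.0 - x - y) / y
    # XYZ -> linear sRGB (D65), standard matrix
    r = 3.2404542 * X - 1.5371385 * Y - 0.4985314 * Z
    g = -0.9692660 * X + 1.8760108 * Y + 0.0415560 * Z
    b = 0.0556434 * X - 0.2040259 * Y + 1.0572252 * Z
    m = max(r, g, b)
    return r / m, g / m, b / m
```

At 9300K this gives blue the largest gain and red the smallest, which lines up with the blue cast we're seeing; at 6500K all three gains come out close to 1.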
Ok, understood. With regards to the first paragraph: the problems exhibited at 4K probably exist at 1080p too - it's just that the TV is blurring them in the upscale.
Maybe I'm wrong and it's more to do with your VA panel, but it might be worth trying to iron the issues out at 4K and then going back to 1080p for playing purposes - it may improve your image quality quite a lot. I have a VA panel here and I've yet to try this shader on it - I need to! I'll report back any interesting findings.
I didn't think that was a daft question, to be honest. The reason professional content doesn't suffer from these issues is that it doesn't break channels out into individual pixels as we do here, so the problems are much more subtle, although arguably still there.
Can this be simulated (aka faked) in an extremely accurate way? This doesn’t have to be a universal solution but one that can be designed to work within the strengths and limitations of OLED TVs.
Instead of lamenting the W-LED isn’t there some way in which we can work with them, harness them or work around them to achieve the end goal? Perhaps some creative, out of the box thinking and reverse engineering? Maybe modifying the original theories being used to develop the shader to cater for the limitations and nuances of the current display technologies at hand while at the same time not compromising on the screenshot comparison accuracy!
The real issue with WOLED for our purposes is that we don't have control over that white subpixel - it's controlled by LG. If we had some level of control over it we might be able to do stuff, but we can't.
With regards to what nesguy is talking about, it may or may not be an issue, as a) there's a camera involved and b) there are two different cameras involved. I feel we're a little way off being able to prove and/or quantify this issue yet. I could well be wrong though - I usually am.
I really hope I’m wrong. I think one of the main selling points of OLED (self-lit pixels) is working directly against us, here. I’m on the edge of my seat waiting for BBB to get us some new photos.
Me too - I'm hoping his QN90A issues are resolved!
That’s why the solution or path to a more accurate result might lie in reverse engineering. The W-OLED isn’t magical. There’s an algorithm. The TV tries to display a particular image and adjusts the W-OLED according to its algorithm to achieve this particular output. I’m saying just by trial and error the output can probably be fudged until it looks the way we want it to look, white subpixel and all. The least that can happen is that we might learn a bit more about the behavior of the white subpixel algorithm by playing around with our output image a bit and observing how it responds.
We might at least get it to look better than it is now. If 99% perfection cannot be realized at the close-up zoomed in level, perhaps we might still be able to achieve 99.9% accuracy at 3 feet?
I think the algorithm is dead simple: once a pixel hits X luminance, start to fade up the white subpixel. Sure, we can find out what luminance value that is and limit ourselves underneath it, but the point of it is that OLEDs are too dark without that white element and certainly not bright enough for our purposes. It's a catch-22 scenario I think, but by all means I could be throwing in the towel too early.
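Purely to make that guess concrete, a toy model of the suspected behaviour might look like this - the 0.6 threshold, the Rec. 709 luma weighting, and the linear fade are all assumptions, since LG's actual algorithm is proprietary:

```python
# Speculative model of WOLED white-subpixel engagement, NOT LG's
# real algorithm: below a luma threshold the panel is pure RGB;
# above it, the white subpixel fades up linearly.

def woled_output(r, g, b, threshold=0.6):
    """r, g, b in [0, 1]; returns (r, g, b, w) drive levels."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
    if luma <= threshold:
        return r, g, b, 0.0
    # fade white up in proportion to how far luma is over the threshold
    w = min(1.0, (luma - threshold) / (1.0 - threshold))
    return r, g, b, w
```

Under this model a saturated primary like pure red never engages the white subpixel (its luma is only ~0.21), which would fit the washing-out only showing up on bright, near-white content.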
Even if you can resolve it, it still leaves the uneven subpixel spacing, which maybe isn't as much of an issue as we might think.
Perhaps there might be ways to counter this: maybe increasing the saturation, or tweaking the shape and spread of the phosphors a bit to lessen the washed-out effect, or just declaring a hard limit on the level we allow the white subpixel to fade up to and hoping it's bright enough for all intents and purposes. Or tweaking it to an optimal level where it's just bright enough to be convincing but not so bright that it washes out the colours. In other words, we're no longer trying to utilize the full brightness range of OLED TVs, but aiming for the sweet spot between barely bright enough and having the white subpixel spoil the party. We could treat the white subpixel itself similarly to how we would try to avoid clipping. There must be a sweet spot somewhere.
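The "treat it like clipping" idea could be as simple as scaling any pixel whose luma would cross the (assumed) engagement threshold back underneath it - a sketch, not anything the shader actually does, and the 0.6 threshold is a placeholder:

```python
# Sketch of avoiding the white subpixel the way we avoid clipping:
# scale pixels down so their luma never crosses the assumed
# engagement threshold. Threshold value is a guess.

def clamp_below_white_threshold(r, g, b, threshold=0.6):
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
    if luma <= threshold:
        return r, g, b
    scale = threshold / luma  # preserves hue/saturation, cuts brightness
    return r * scale, g * scale, b * scale
```

The cost is exactly the brightness trade-off described above: full white gets pulled down to the threshold level.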
Giving up a little brightness overall may not be such a bad thing once you can see all of the details you need to see, as in the pics @BendBombBoom shared recently, where there was so much more detail resolved in the dark areas of the floor in the OLED image vs. the QLED.
Growing up, I never had any professional calibration done on my CRT and I preferred things a bit on the darker, more saturated side - but that's just me.
At least we could start to tackle the problem from somewhere. Who knows where we’ll end up after a while?
I think uneven subpixels would matter more if it were a subpixel-dependent mask (e.g., magenta/green), but RGBX is only a little screwy with non-RGB layouts, since the gaps are already about 3/4 of a pixel apart.