I really hope I’m wrong. I think one of the main selling points of OLED (self-lit pixels) is working directly against us, here. I’m on the edge of my seat waiting for BBB to get us some new photos.
Me too! I’m hoping his QN90A issues are resolved!
That’s why the solution, or at least a path to a more accurate result, might lie in reverse engineering. The W-OLED isn’t magical; there’s an algorithm. The TV tries to display a particular image and adjusts the white subpixel according to its algorithm to achieve that output. I’m saying that just by trial and error, the output can probably be fudged until it looks the way we want it to look, white subpixel and all. At the very least, we might learn a bit more about the behavior of the white subpixel algorithm by playing around with our output image and observing how it responds.
We might at least get it to look better than it does now. If 99% perfection can’t be achieved at the zoomed-in, close-up level, perhaps we can still achieve 99.9% accuracy at 3 feet?
I think the algorithm is dead simple: once a pixel hits X luminance, start fading up the white subpixel. Sure, we can find out what that luminance value is and limit ourselves to staying underneath it, but the whole point of the white element is that OLEDs are too dark without it, and certainly not bright enough for our purposes. It’s a catch-22, I think, but by all means, I could be throwing in the towel too early.
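In C-style pseudocode, the behavior I’m imagining is something like this (the threshold value and the linear ramp are pure guesses on my part, not measured):

```c
#include <math.h>

/* Guessed model of the W-OLED behavior described above: below some
 * luminance threshold the panel uses RGB only; above it, the white
 * subpixel fades up to carry the extra brightness. The 0.6 threshold
 * and the linear ramp are assumptions, not measured values. */
#define WHITE_THRESHOLD 0.6f

float white_subpixel_drive(float target_luminance)
{
    if (target_luminance <= WHITE_THRESHOLD)
        return 0.0f;  /* pure RGB, white subpixel off */

    /* fade white in proportionally to how far over the threshold we are */
    float over = (target_luminance - WHITE_THRESHOLD) / (1.0f - WHITE_THRESHOLD);
    return fminf(over, 1.0f);
}
```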
Even if you can resolve it, that still leaves the uneven subpixel spacing, which maybe isn’t as much of an issue as we might think.
Perhaps there are ways to counter this: increasing the saturation, tweaking the shape and spread of the phosphors a bit to lessen the washed-out effect, or declaring a hard limit on how far we allow the white subpixel to fade up and hoping that’s bright enough for all intents and purposes. Or just tweak it to an optimal level where it’s bright enough to be convincing but not bright enough to wash out the colours. In other words, we’d no longer be trying to use the full brightness range of OLED TVs, but aiming for the sweet spot between barely bright enough and the white subpixel spoiling the party. We could treat the white subpixel itself similarly to how we try to avoid clipping. There must be a sweet spot somewhere.
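To make the hard-limit idea concrete, it could be as simple as this sketch (both constants are placeholders we’d have to find by experiment, not known panel values):

```c
/* Sketch of the "sweet spot" idea: clamp the shader's output luminance
 * just below the (assumed) point where the white subpixel fades up,
 * the same way you would back off to avoid clipping at the top end. */
#define WHITE_THRESHOLD 0.6f   /* assumed white fade-up point */
#define HEADROOM        0.02f  /* arbitrary safety margin     */

float clamp_below_white(float luminance)
{
    float limit = WHITE_THRESHOLD - HEADROOM;
    return luminance > limit ? limit : luminance;
}
```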
Giving up a little brightness overall may not be such a bad thing once you can see all the details you need to see, as in the pics @BendBombBoom shared recently, where there was so much more detail in the dark areas of the floor that could be resolved in the OLED image vs. the QLED.
Growing up, I never had any professional calibration done on my CRT, and I preferred things a bit on the darker and more saturated side, but that’s just me.
At least we could start to tackle the problem from somewhere. Who knows where we’ll end up after a while?
I think uneven subpixels would matter more with a subpixel-dependent mask (e.g., magenta/green), but RGBX is only a little screwy with non-RGB layouts, since the gaps are already about 3/4 of a pixel apart.
Hmm, just realized my current setup may not be ideal at all for testing this shader.
So first of all, I normally output to my TV at 1080p through my PC’s DisplayPort, into a passive DP-to-HDMI adapter, into an HDMI cable. If I try outputting 1080p through the HDMI port directly, for whatever reason it automatically wants to output at 120Hz, which requires setting the swap interval to 2 to correct for it, whereas through DisplayPort it caps at 1080p@60Hz (it doesn’t even let me use 4K), so I figured this was just easier. If I want to use 4K, I have to switch to the HDMI port.
Well, I just found out a few things. The PC’s HDMI port is version 1.4, which maxes out at 1080p@120 or 4K@30, but the DisplayPort is version 1.2, which apparently DOES support 4K@60. It also turns out the cheap HDMI cable I’m using is version 1.4, and very likely the cheapo DP-to-HDMI adapter is to blame for 4K not being available through DP. So this rig may well be capable of 4K@60 and I’m just mucking it all up. Furthermore, I can’t discount that this has something to do with some of the issues I’m having with this shader, so perhaps take my feedback with a grain of salt until I can get a proper cable and output through DisplayPort properly.
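For what it’s worth, the numbers roughly check out; here’s a quick back-of-the-envelope calculation (nominal spec data rates, ignoring blanking overhead, so treat it as ballpark only):

```c
#include <stdio.h>

int main(void)
{
    /* raw bits per second = active pixels * refresh * 24 bits (8-bit RGB);
     * real signals add blanking overhead on top of this */
    double uhd60 = 3840.0 * 2160.0 * 60.0 * 24.0 / 1e9;  /* ~11.9 Gbit/s */
    double uhd30 = 3840.0 * 2160.0 * 30.0 * 24.0 / 1e9;  /* ~6.0 Gbit/s  */

    /* HDMI 1.4 carries ~8.16 Gbit/s of data, DP 1.2 (HBR2) ~17.28 */
    printf("4K@60 needs ~%.1f Gbit/s raw: too much for HDMI 1.4, fine for DP 1.2\n", uhd60);
    printf("4K@30 needs ~%.1f Gbit/s raw: why HDMI 1.4 tops out there\n", uhd30);
    return 0;
}
```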
Well, I’m here to help once you’re happy with things. I’m very interested in getting this working for you, as I think this shader should work fine within limits.
I don’t think turning off the white subpixel is going to work, for the same reasons @MajorPainTheCactus just cited. It’s worth a shot, though; we should try to be as empirical as possible.
Just a thought, but perhaps there’s an option to turn the white subpixel off in the service menu? Though I’ve read that for a lot of these newer TVs, you need a service remote just to be able to access it.
You need an active DisplayPort-to-HDMI adapter to get 4K@60Hz. You probably just have a passive one.
They aren’t that expensive, just do your research to ensure you get one that meets your requirements.
This doesn’t seem to make much sense unless you disabled or deleted resolutions using CRU (Custom Resolution Utility) or something. This might be a good time to use DDU (Display Driver Uninstaller) and start with a fresh driver install and settings. You can also use CRU to reset any messed-up EDID data you might have in your registry.
Another thing you could try is Show Hidden Devices in Device Manager and delete all of the entries for Disconnected Displays that you might have in there.
- Active DisplayPort to HDMI Cable 4K@60Hz HDR, CableCreation 8FT Unidirectional DisplayPort 1.4 to HDMI Monitor Cable (supports 4K@60Hz, 2K@144Hz, 1080p@144Hz, Eyefinity Multi-Display): https://www.amazon.com/dp/B082CXMBCQ
- Plugable Active DisplayPort to HDMI Adapter, connects any DisplayPort-enabled PC or tablet to an HDMI-enabled monitor, TV, or projector (HDMI 2.0 up to 4K 3840x2160@60Hz): https://www.amazon.com/dp/B00S0C7QO8
- BENFEI DisplayPort to HDMI Adapter (4K@60Hz), male to female, compatible with HP, ThinkPad, AMD, NVIDIA, desktops, and more: https://www.amazon.com/dp/B07ZNNRYFL
I’m not necessarily pushing for turning the white subpixel off, just manipulating the final output so that it looks good and possibly makes up for, or masks, the degradation the white subpixel causes. If that means turning it off, then by all means. It might also mean calibrating the shader to keep things in a range where the white subpixel can help with brightness but not get in the way of saturation, or at least not be as bad or noticeable as it is now, so that overall things land at an even more acceptable level.
If it’s possible to get the OLED to look like the shots that @MajorPainTheCactus has been posting, without color or brightness issues, and with well-defined RGB phosphors, then I’m all for doing whatever it takes.
None - kind of - I suppose it depends on what you define as a blur.
Basically, I don’t use any kernel-based blur or filter: there’s no transfer of information/interference between scanlines (the y direction), and since the beam only moves forwards, there’s little if any transfer of information/interference backwards (in the x direction). It’s only in the forward direction that information matters.
(That may not all be true in terms of the output circuitry of a console etc., but in terms of the CRT itself it seems to hold true. I’m willing to be told otherwise, but I don’t see it with component cables and my PVM at least.)
In that respect, I just use a cubic bezier curve to interpolate between the current pixel and the next, which you could call a blur of sorts.
I use cubic beziers (for pretty much everything) because they’re fast and you can control them quite well.
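For the curious, evaluating one is just the standard cubic bezier polynomial; something like this (the endpoint-holding in the second function is an illustrative shaping choice, not the shader’s actual control points):

```c
/* standard 1D cubic bezier: p0/p3 are the endpoints, p1/p2 shape the curve */
float cubic_bezier(float p0, float p1, float p2, float p3, float t)
{
    float u = 1.0f - t;
    return u * u * u * p0
         + 3.0f * u * u * t * p1
         + 3.0f * u * t * t * p2
         + t * t * t * p3;
}

/* interpolate from the current pixel value a to the next value b;
 * holding the control points at the endpoints gives a smoothstep-like
 * ease (flatter than linear at the edges), purely as an example */
float sample_between(float a, float b, float t)
{
    return cubic_bezier(a, a, b, b, t);
}
```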
I’m running Lakka on this PC, actually. It’s an Optiplex I dedicated solely toward emulation and nothing else. Thanks for the cable recommendations, though. I’ll look into it.
In the meantime, I managed to edit the shader so the 800TVL mask (which I suppose gets halved to 400TVL at 1080p) uses a CMY pattern, or rather YMC, so it works better with my BGR arrangement. It looks VERY good on this display: very bright even in SDR, sharp but not TOO sharp, and since it’s faster than guest-advanced-fastest (what I was using before), I was able to turn max swapchain images down from 3 to 2. All in all, I am quite pleased.
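To give an idea of what I mean by a YMC pattern, here’s the gist in C (illustrative values and layout, not the actual shader code):

```c
/* Each YMC slot passes two of the three colour channels instead of one,
 * so the mask is roughly twice as bright as a plain RGB mask at the
 * same strength. The ordering and encoding here are just for illustration. */
static const float ymc_mask[3][3] = {
    { 1.0f, 1.0f, 0.0f },  /* yellow  = R + G */
    { 1.0f, 0.0f, 1.0f },  /* magenta = R + B */
    { 0.0f, 1.0f, 1.0f },  /* cyan    = G + B */
};

/* attenuate a pixel's channels based on its horizontal position */
void apply_mask(float rgb[3], int x)
{
    const float *m = ymc_mask[x % 3];
    for (int i = 0; i < 3; i++)
        rgb[i] *= m[i];
}
```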
I do have to say that some way to introduce horizontal blur would be nice, as I find it works quite well at blending pre-rendered graphics in games like DKC, but honestly, it’s fine as is.
Wouldn’t RGB work?
xxRxGxBxx
YMC is cool, but kinda weird. I guess it would be a lot brighter. Does it look like RGB phosphors in a photo?
xGRBxRBG
Have you considered a Megatron-mini for 1080p SDR displays, using all the brightness tricks available that don’t blur the image?
A magenta-green aperture grille and a magenta-green checkerboard would work well. When I tried RGBX (on a 1080p SDR display), the black lines between triads were too noticeable even with the display brightness at max.
If you check out some of the screenshots I’ve recently posted, they’re decently bright on a maxed out SDR display, I think, and that’s with mask strength at 100%. I’m cheating by using glow, though.
I think work on Megatron should take priority, of course.
RGB does work, but keep in mind that in my particular case I’m outputting at 1080p, which my TV then upscales to 4K using some kind of interpolation. Because of that, I can’t really take advantage of exact subpixel positions (the interpolation messes them up a bit), and if I can’t have accuracy, I might as well cheat a little and gain some much-needed brightness. And yes, it somehow does look like RGB stripes on close inspection, perhaps a bit washed out, but the brightness boost is completely worth it.
I don’t understand why upscaling from 1080p to 4K requires any kind of interpolation. Shouldn’t that just be a case of integer/nearest-neighbour scaling, where 1 pixel becomes a 2x2 block of 4 identical pixels?
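Conceptually it’s just this (buffer layout and pixel format chosen purely for illustration):

```c
#include <stdint.h>

/* 3840x2160 is exactly 2x 1920x1080 in each dimension, so every source
 * pixel maps to a 2x2 block of identical destination pixels and no
 * interpolation is needed. */
void upscale_2x_nearest(const uint32_t *src, uint32_t *dst,
                        int src_w, int src_h)
{
    int dst_w = src_w * 2;
    for (int y = 0; y < src_h; y++) {
        for (int x = 0; x < src_w; x++) {
            uint32_t p = src[y * src_w + x];
            /* write the same pixel into all four positions of the block */
            dst[(2 * y)     * dst_w + 2 * x]     = p;
            dst[(2 * y)     * dst_w + 2 * x + 1] = p;
            dst[(2 * y + 1) * dst_w + 2 * x]     = p;
            dst[(2 * y + 1) * dst_w + 2 * x + 1] = p;
        }
    }
}
```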
1080p to 4K on my TVs results in perfect scaling with zero artifacts.
Are you using Game Mode + PC Input Label + Just Scan for the Aspect Ratio?
What about your graphics driver? Are you using GPU scaling or display scaling?
Something seems a bit off and not what is expected from your setup.