Has ABL been an issue on the S95B when using the Megatron? All I can say is the photos you posted a while back look fantastic. Have you encountered any issues aside from the mask/color problem (which would affect any recent TV)? Do highlights pop the way they should against a dark background, etc? (this was an issue on the QLEDs as I recall)
I see here that you've done some interesting things with the settings, but the SDR presets have very different values, resulting in a sub-optimal image. Any plans to update the SDR presets? I think the Megatron would reach a much wider audience that way (although it's really meant to be experienced in HDR).
Also I think the shader has been updated quite a bit since these photos were taken, so the same settings shown here won’t necessarily produce the same results.
Ok so is that when the white subpixels get turned on - when all three RGB sub elements are on? This would line up with what I've seen in your various photos I suppose (as in I never see the white sub element on). Doesn't that mean that anything using near-100% masks will be really quite dark? I suppose what we need is a colorimeter to test what the luminance is, as I suspect we'd get a lot of unexpected values out of various displays with these shaders.
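As a back-of-the-envelope sketch of why a full-strength mask darkens things so much: with a 100% RGB mask, only about one in three subpixel columns is lit for a white field, so full-white luminance drops to roughly a third of the unmasked figure. (The 1/3 lit-fraction is an illustrative assumption, not a measurement - real masks and subpixel layouts vary.)

```python
# Rough estimate of full-white luminance under an RGB phosphor mask.
# Assumption (illustrative only): at 100% mask strength each pixel
# lights only its own primary, i.e. ~1/3 of subpixels contribute to
# a white field; at 0% strength all subpixels contribute.

def masked_luminance(peak_nits: float, mask_strength: float) -> float:
    """Estimated full-white luminance for mask_strength in [0.0, 1.0]."""
    lit_fraction = 1.0 - mask_strength * (2.0 / 3.0)
    return peak_nits * lit_fraction

# e.g. a display that measures 800 nits on a pure-white test window:
for strength in (0.0, 0.5, 1.0):
    print(f"mask {strength:.0%}: ~{masked_luminance(800, strength):.0f} nits")
```

This is exactly why a colorimeter reading on masked content would likely come in well under the headline peak-brightness numbers.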
No, ABL hasn't been an issue as far as I know, but I'd be really quite interested in what unhinging the brightness would do with the Megatron (if you could do it). As for QD-OLED itself, it's pretty good: perfect blacks and little light bleed. It's just the weird triad layout it uses, but that doesn't really affect these shaders, at least for slot mask and aperture grille - I don't think I've really tried dot masks.
Yes, I updated the settings to hopefully be more intuitive. I do need to find time to go back and fix up all the presets, I just don't have the time atm - one day though!
This never occurs; all 4 subpixels are never on at the same time. So if the RGB subpixels are on, white is definitely going to be off. In my most recent photo, which included elements of the RetroArch menu, you could see this phenomenon clearly: the white menu text and lines definitely appeared pure white, which suggests that the white subpixel was being used to display pure white elements.
This also lines up with what LG has been saying about using the white subpixel to make the TV brighter, since peak brightness is measured using pure white if I'm not mistaken, and if a TV were to generate pure white from a mix of RGB then that would use 3 times the power and would definitely not be as efficient as using a pure white subpixel.
If I recall correctly, the OLEDs in a W-OLED panel are all natively white before they get filtered into their RGB elements. It would make more sense, therefore, to have an extra unfiltered subpixel which can perform at peak brightness because it loses no light to filtering, especially when the alternative is mixing three filtered primaries just to get back the white you had from the get go.
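The "3 times the power" point above can be sketched numerically. Assuming (purely for illustration) that each colour filter passes about a third of the white OLED's light while the unfiltered white subpixel passes nearly all of it, mixing white from the three filtered primaries costs roughly triple the power of just lighting the white subpixel:

```python
# Rough efficiency model with an assumed (not measured) filter figure:
# each colour filter passes ~1/3 of the white OLED's light; the
# unfiltered white subpixel passes ~all of it.

FILTER_TRANSMISSION = 1.0 / 3.0  # assumption for illustration

def light_per_watt_rgb_mix() -> float:
    # Three filtered subpixels driven equally to produce white
    power_in = 3.0
    light_out = 3.0 * FILTER_TRANSMISSION
    return light_out / power_in

def light_per_watt_white() -> float:
    # One unfiltered white subpixel
    return 1.0 / 1.0

ratio = light_per_watt_white() / light_per_watt_rgb_mix()
print(f"white subpixel ~{ratio:.0f}x more efficient at producing pure white")
```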
Over time the information as well as my understanding of the structure of W-OLED panels has evolved somewhat. Initially, I remember reading that all of the subpixels were natively white but required a colour filtering layer in order to achieve RGB.
The reasoning at the time (and we can look at LG’s W-OLED patents to verify this) was that one of the main challenges in creating large OLED panels to be used as TVs was that different coloured (native) OLEDs aged at different rates.
This was acceptable for use in cell phones, which people tend to hold onto for a much shorter time than the 10+ year replacement cycle/life expectancy traditionally expected of a TV.
LG’s solution to the uneven wearing was to use ALL white OLEDs and just filter them.
Their patent is what let them hold onto a monopoly on this market for as long as they did.
In order to sidestep this patent Samsung came up with something slightly different. Use ALL blue OLEDs as the light source then use Quantum Dots to filter them into the various RGB elements.
Quantum dots have their own benefits, most notably the brightness at which they glow, which also contributes to the high colour volume they can produce.
No rush! I mostly asked because I’ll probably have to wait another year before upgrading displays, lol.
Have you played around much with BFI? Are you able to maintain CRT-like brightness while using it?
So if the white subpixel is never turned on, and it is primarily responsible for adding HDR-level brightness, does that mean that W-OLED, with this shader's current configuration, will be further hampered compared to QD-OLED when it comes to brightness?
This might have to be measured and tested. It may not necessarily be hampered as all that is really needed is for a display to be bright enough and I think both W-OLED and QD-OLED displays have proven their ability to be.
Even some SDR displays might be bright enough.
Also, despite QD-OLED's theoretical advantages, there might actually be some quirks which in reality could hamper that technology vs W-OLED. For example, QD-OLED's ABL might be too aggressive or not optimally tuned, and unable to be disabled, whereas W-OLED's ABL and other brightness-limiting, burn-in-mitigating technology have up to now been able to be turned off (at least on LG TVs).
Based on my observations in this thread, we have users who own both types of display who have been able to achieve a satisfactory experience.
What more could one ask for when you’re already using possibly the most accurate CRT Shader on the planet?
Yeah, I would guess that when it comes to getting proper brightness it must still be getting activated with W-OLEDs - might be interesting to test.
Proper brightness or peak brightness? Is it that you're thinking the other subpixels are incapable of producing "proper" brightness, or at least of getting very bright? Remember, LEDs of all types and colours have been around for a long time and they're all capable of getting pretty bright. If W-OLED panels use all white OLEDs as a base, with some of them filtered to produce primary colours, don't you think all of those OLED subpixels will be at least almost as bright as one another, including the white subpixel?
The white subpixel being used for additional brightness, efficiency and purity of white doesn’t mean that when it’s not in use everything else is simply dark.
I just meant that it is likely activating to achieve up-to-spec brightness for HDR content - up to 800 nits peak brightness with the LG C2, I believe? I think otherwise it would peak at ~350 nits SDR, with nothing between that and 800. In retrospect, I think that when I brought that up it was a non-issue, but I'm running on little sleep today.
When it comes to burn-in, I wonder how much the consistent demand for HDR brightness from the emulated phosphors will contribute on either monitor, though at least it's changing colour consistently. I'm also a little surprised that the heat sink on the Sony QD-OLED doesn't appear to be helping much; it has actually been performing worse in these tests from what I have seen. I've read theories about how the pixel-refresh cycles differing between various sets could be a factor, but I have no real idea how valid that is.
I assume it turns them on for various light greys and off-whites, does it not? Or is it only for pure white? There are lots of highlights that aren't pure white. If so, at what point does it switch off the RGB subpixels and turn on the white subpixel?
Obviously, with our shaders W-OLED is going to be quite a lot darker than what RTINGS' peak-brightness figures state, as those are all based on pure white.
I'm really not sure; this is something that perhaps someone can set up and test using a macro camera lens to verify.
I doubt it would only be used for pure white. In the example I gave, it was being used for pure white because the RetroArch menu was active, and that doesn't use RGB phosphor emulation, while in the background a game was being emulated.
Except in the dark areas (which wouldn't activate the white subpixel), the emulated content would have all RGB subpixels active, especially in the brightest scenes and when producing grey through white, so the white subpixel probably won't ever be activated, at least when using Sony Megatron Color Video Monitor. It might be when using another shader with lots of "fake", err…synthetic glow, bloom or reduced mask strength though.
It’s interesting that after so long this remains somewhat of a mystery.
Just for clarity, it doesn’t switch off ALL the RGB subpixels once the white subpixel is on. It is designed to not have more than 3 subpixels on at the same time. So RGB, RGW, RWG, WGB can be on at the same time but never WRGB.
Any RTINGS photos showing all 4 subpixels active are composited from multiple photos superimposed on one another.
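The "never all four at once" rule described above can be modelled with a toy sketch. This is purely illustrative and not LG's actual subpixel-rendering algorithm: it just shifts the common white component of an RGB request onto the W subpixel, which by construction leaves at most three subpixels lit (the smallest primary always zeroes out).

```python
# Toy model (hypothetical, not LG's real algorithm) of a W-OLED pixel
# that may light any three of its four subpixels but never all of
# W, R, G and B at once. The shared 'white' component of the RGB
# request is moved onto the W subpixel.

def render_pixel(r: float, g: float, b: float) -> dict:
    """Map an RGB request (each 0..1) to WRGB subpixel drive levels."""
    w = min(r, g, b)  # common white component
    out = {"W": w, "R": r - w, "G": g - w, "B": b - w}
    lit = [name for name, level in out.items() if level > 0]
    assert len(lit) <= 3, "never all four subpixels at once"
    return out

print(render_pixel(1.0, 1.0, 1.0))  # pure white -> W subpixel only
print(render_pixel(1.0, 0.5, 0.2))  # mixed colour -> W plus two primaries
```

Note how pure white comes out as W alone (matching the RetroArch menu observation earlier in the thread), while saturated colours leave W mostly or entirely off.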
From the cropped screenshots below, it appears as though the white subpixel is being used for the font and the white line as well as the grey line in the RetroArch menu. I don’t see any of the coloured subpixels active here.
Those pairs of RGB dot columns, meanwhile, are from the white box around the pilot in US Squadron - SNES. I don't see any white subpixels active here.
So, if you take the shader out of the equation, do those same areas have the white subpixel lit? I would think step 1 would be to find some way to turn it on, then start seeing how to turn it off on command.
This should be a trivial test to perform. I’ll give it a go when I get a chance and we’ll see what happens when we take off the mask and the shader.
Mask/Shader Off
Mask/Shader On
Hi there. Yesterday I found out about the Sony Megatron shader and I was blown away by how much it resembled a real CRT. I also saw the Dead Cells vid on YouTube, and in motion too it looks fabulous. So today I've been trying to implement it in ReShade. I'm using "ReShade_Setup_5.8.0_Addon" so that I can also use the AutoHDR addon; I was able to enable that successfully within ReShade. Regrettably, with the SonyMegatron.fx shader I keep getting a red "failed to compile" message in my shader overview.
To be clear, I'm using ReShade with the standalone version of MAME (so no RetroArch), and either of the DirectX (9 or 10/11/12) APIs functions with ReShade. When I use OpenGL or Vulkan I cannot open ReShade, and no folders are automatically created within my MAME folder where the MAME exe is. But either of the DirectX options does work. Maybe this is relevant info, maybe not.
I downloaded the Sony Megatron Github files (9 files in total), with one of them being the SonyMegatron.fx shader and I extracted all files to:
LaunchBox\Emulators\mame 0.249_HOR\reshade-shaders\Shaders\SonyMegatron-ReShade-main
with ‘SonyMegatron-ReShade-main’ being the main folder where those files are in.
In reshade I open the settings tab and add the directory for this folder under the effects section as I assume otherwise reshade has no idea it has to look there for the Sony Megatron shader.
The 7 ini files I place in the root directory where the MAME exe is located, which is the 'mame 0.249_HOR' folder also mentioned above. Those 7 ini files are also kept in the folder under reshade-shaders\Shaders\SonyMegatron-ReShade-main. Maybe this is relevant info, maybe not.
Now, the error message I get is [SonyMegatron.fx] failed to compile and the full message is visible in the screenshot.
Basically: error X3535, bitwise operations not supported on target ps_3_0.
I know that line is telling me to look at row 1557, as the error is there, but I'm not knowledgeable enough to figure it out.
Sorry for the somewhat longish post, but I’d like to be thorough.
Can anyone please clarify what might be going on? Thanks
I believe that error indicates that it's trying to use DirectX 9 but requires functions that are not available in DX9's "shader model" version. Try using DX10+ instead and, if it still fails, see if it's a different error; if so, post it here and we can take a look.
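For context on why ps_3_0 in particular chokes: Shader Model 3 has no integer bitwise operators, which is exactly what error X3535 complains about; integer ops arrived with Shader Model 4 (DX10+). Besides switching backends, shader authors sometimes emulate the bitwise ops arithmetically. A small Python sketch of the identities such a workaround relies on (illustrative of the math only; an actual fix would mean editing the .fx file or targeting a newer shader model):

```python
# For non-negative integers and power-of-two operands:
#   x & (2**n - 1)  ==  x % 2**n      (mask off low bits)
#   x >> n          ==  x // 2**n     (shift right)
# These arithmetic forms work on targets that lack bitwise operators.

def and_mask(x: int, n_bits: int) -> int:
    return x % (1 << n_bits)   # emulates x & ((1 << n_bits) - 1)

def shift_right(x: int, n: int) -> int:
    return x // (1 << n)       # emulates x >> n

for x in range(0, 256, 37):
    assert and_mask(x, 3) == x & 0b111
    assert shift_right(x, 2) == x >> 2
print("arithmetic emulation matches the bitwise results")
```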
Be aware that somebody already did a port to ReShade. I haven't tested it a lot, but it functions (sort of). There should be some talk about it in this thread.
Hi there, I did the entire process all over again, but instead selected direct x 10/11/12 as the API in the initial reshade setup process. The error persists though.
Is there a way to copy and paste the entire error message from within ReShade? In any case, I've attached the error message; it's very similar to the one I posted earlier today. Only the characters after the word 'shader' seem different, but from (1557,14-28) onward it's 100% the same.
If I can do more to help clarify the issue, by all means let me know.
I have to emphasize once more that I don’t use retroarch in this particular case, but rather standalone mame.
Thanks