Sony Megatron Colour Video Monitor

I prefer to say: each emulated CRT phosphor triad.

Thanks for finally articulating this clearly, because that’s exactly what’s going on, and it’s a miracle that it happens to align properly for slot masks, etc.

And this very precise accident is why BBGGRRX or XBBGGRR doesn’t align properly with RWBG/WOLED displays.

Since this is the dawn of RGWB tandem OLED and new subpixel layouts, can you provide a simple blueprint, or some tips, for adding new subpixel layouts or updating existing ones in the near future?
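To illustrate what I mean, here’s a toy sketch of the kind of table I imagine (these names are made up for illustration and are not from the Megatron source): a layout is just an ordered set of emitters per physical pixel, and the mask has to decide which emitter carries each emulated phosphor.

```python
# Toy sketch - hypothetical names, not from the actual shader code.
LAYOUTS = {
    "RGB":  ("R", "G", "B"),
    "BGR":  ("B", "G", "R"),
    "RWBG": ("R", "W", "B", "G"),  # LG WOLED-style four-emitter pixel
}

def mask_for_phosphor(layout_name, phosphor):
    """Per-emitter weights that light only the requested phosphor colour."""
    layout = LAYOUTS[layout_name]
    # Note the white emitter stays dark for a pure-primary phosphor.
    return tuple(1.0 if emitter == phosphor else 0.0 for emitter in layout)

print(mask_for_phosphor("RWBG", "R"))  # (1.0, 0.0, 0.0, 0.0)
```

Adding a new layout would then mostly be a matter of adding its emitter ordering, which is why a documented blueprint would go a long way.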

It’s unfortunate that no one with an LG G5 has come forward to assist in testing and ensuring that they’re getting properly aligned subpixel mask emulation. I guess ignorance is bliss; few people really care much about, or respect the value of, getting CRT emulation right from the building blocks up, except us few.

For what it’s worth, I downloaded a new nightly a couple of days ago and converted all of the presets in my latest Epic preset pack, which is based on Sony Megatron Colour Video Monitor v1. From the one or two presets I tested, they looked pretty much identical to how they looked before the conversion.

I might still need to reinstate my custom modifications to some of the 8K masks, which gave me additional 4K masks at different heights that I liked. Given the changes you introduced to the mask code, though, I’d first like to see how they look with the new code before deciding whether they need any further modification.

The reason I did that in the first place was that I preferred the stubbier 8K masks over the more elongated 4K ones, but the 8K ones were then no longer RGB/RBG/BGR.


Of course it matters: if our native colour space were HDR10, we wouldn’t need to do anything for HDR output, but we would have to do something to convert it to sRGB/r709. Games have to do this through tonemapping etc. Look, let’s stop arguing about whether the sky is blue - it is.


I don’t get what you mean by this. In all three cases, HDR, WCG, SDR, I have to do a conversion. I don’t get anything for free because the source data is in a different space (let’s just call it linear RGB).

I understand that when we have a final shader pass in 10-bit and an 8-bit swap chain, we have to convert to 8-bit somehow. That’s totally acceptable. But we are all saying that a 10-bit SDR swap chain is available (albeit not to everyone), and therefore the 10-bit to 8-bit conversion in SDR should not need to happen when we could make use of that swap chain. Further, we’ve seen how this invisible pass insertion has created bugs and user unfriendliness, so from a maintenance standpoint it makes sense for the program to try to match the swap chain to the desired format if at all possible.
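To put a number on what the extra 8-bit step costs, here’s a toy calculation (plain Python, nothing to do with RetroArch’s actual code path):

```python
def quantize(x, bits):
    """Snap a normalized [0, 1] value to an unsigned integer grid."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

x = 0.4337                        # some value a 10-bit final pass produced
q10 = quantize(x, 10)             # what a 10-bit SDR swap chain could carry
q8 = quantize(q10, 8)             # what the forced 8-bit step leaves us with
print(abs(q10 - x), abs(q8 - x))  # the 8-bit error is several times larger
```

The 8-bit grid has 256 steps against the 10-bit grid’s 1024, so the rounding error roughly quadruples, which is exactly the banding the mask/scanline gradients make visible.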

Look, let’s start again: we select our swap chain via the RetroArch menu. Custom shaders ask for a desired render-target format via the preset or #pragma format. The two requests may mismatch, and we have to decide how to cope with that. In all cases you will need to output in the swap chain format.

You seem to think (I could be wrong) that adding support for 10-bit SDR would help with this: it won’t; you will still need to support HDR swap chains in your shader. In fact this will make things more difficult, and no one will use it, as we have HDR.

Probably poor wording on my part. Adding 10-bit SDR support (let’s just call it WCG, even if that’s not perfectly correct) is a feature request. With the current detection that triggers inverse tonemapping when HDR is on, a WCG shader needs a conditional to output either a gamma-corrected signal (which ends up 8-bit anyway, because RA won’t use a 10-bit swap chain if HDR is not enabled) or an HDR10 signal. I think that’s acceptable, and it’s a ‘problem’ that is totally independent of whether 10-bit output happens or not. I actually just want to have WCG for its own sake. I think the confusion stems from the earlier discussion about the 10-bit/16-bit format triggering the HDR10 assumption unexpectedly.
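To make that conditional concrete, here’s a rough model in plain Python (the transfer functions are the standard published ones, but `final_pass` and `paper_white_nits` are my own illustrative names, not the shader’s actual constants):

```python
def srgb_encode(x):
    """Linear -> sRGB transfer function (IEC 61966-2-1)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def pq_encode(nits):
    """Linear luminance in cd/m^2 -> PQ signal (SMPTE ST 2084)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def final_pass(linear, hdr_enabled, paper_white_nits=200):
    """The branch in question: PQ out when HDR is on, gamma out otherwise."""
    if hdr_enabled:
        return pq_encode(linear * paper_white_nits)
    return srgb_encode(linear)  # RA then quantizes this to the 8-bit chain
```

The branch itself is cheap; the point is just that the SDR side is what gets forced through 8 bits today.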

The colors look wrong to visual inspection, not the analysis tools. The analysis tools just help determine what has changed and what might be going wrong.

What should be pure primaries (red, green, or blue) now fully activate multiple colors of subpixel, and all colors look substantially different from the shaderless output (with everything set to Rec709/Accurate mode).

Actual CRTs didn’t (intentionally) activate multiple colors of phosphor on pure primaries, as is demonstrated by this post from the CRT photos thread (Calling all CRT owners: photos please!).

RetroGames4K’s image from Rockman X3

Megatron 2.6.2 (Colour Accurate mode)

Megatron 2.7.0

Actual CRT gamuts were literally determined by the physical color of their phosphors. I suspect that the mismatch between our target color gamut primaries and our simulated phosphor primaries is at least part of the issue here.

For the record: I absolutely do not ever use AI tools intentionally, as I have yet to be convinced they are in any way fit for purpose. Insofar as I have been forced to interact with them against my will, they have done the absolute opposite of impressing me, especially Google’s AI Overview, which has been wrong in some way or another literally every. single. time. it has come up whilst I was looking something up.

I remember Alien: Isolation having 10-bit SDR support because it was discussed regularly when the game came out, nearly a year before HDR10 was even announced.

(I recognize that many software developers have been impressed by AI-assisted coding, and I acknowledge the possibility that it might have a real niche for that purpose specifically, but even there I remain extremely skeptical given the studies suggesting that people regularly think AI saved them time when it actually took them longer.)

Getting things ready for Sony Megatron Colour Video Monitor v2!

This is a tribute to the shader devs who made this all possible. Know that your hard work is not in vain and is greatly appreciated!

CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC Composite Sharp PVM Edition Epic CAR9x7x or CAR7x6x W4.slangp

CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC Composite Shadow Mask Epic CAR9x8x or CAR7x6x W3.slangp

CyberLab MegatronV2

CyberLab Megatron V2 Presets incoming…

For those who say they don’t know which preset to choose, or that there are so many, this is why I say: read the thread.

CyberLab Megatron miniLED 4K HDR Game BFI SNES Composite PVM Edition Epic W5.slangp

CyberLab Megatron miniLED 4K HDR Game BFI SNES Composite Sharp PVM Edition Epic W4.slangp

Introducing CyberLab Megatron miniLED 4K HDR Game BFI Turbo Duo_DC S-Video CyberTron Epic CAR9x8x or CAR7x6x W4.slangp

Hope more and more folks are viewing these on their properly calibrated HDR setups now, and you should definitely be viewing them at 1:1 scale or zoomed in. SDR users, you need to brighten your display in order to view them properly.

Bonus Content:


Are you updating your Online Shaders as well after updating to the new RetroArch nightly builds when testing? Can you also provide the presets when showing examples, so that others may test and compare?


I download the new versions of Megatron directly from the github repository and place them into versioned folders for easier testing and modification (hdr262, hdr270, etc.).

Both the 2.6.2 and 2.7.0 images are crt-sony-megatron-v2-default-config.slangp with the RWBG subpixel layout, 670 for both Luminance settings, 300 TVL, and both Colour System and Phosphors set to neutral/0.

@MajorPainTheCactus did change the scanline min/max/attack and beam sharpness/attack settings for that preset between the two, but those have no effect on what I was showing in those images (all three primaries being fully lit when they shouldn’t be).

“Mask Accurate” always did the same thing on at least LG OLEDs; that is part of the reason I have always told people with LG OLEDs to use “Colour Accurate”. (That is, “Mask Accurate” moved the simulated phosphor primaries to the Rec.2020 primaries, caused all three phosphor primaries to fully light up even when they shouldn’t, and made colors notably inaccurate.)


Although it appears we’ve lost those later images, a bug very much has crept in. It’s basically in the conversion from XYZ to r2020, but I’ve not been able to pinpoint the fix so far - I knock other values out, a bit like whack-a-mole. Bear with me; it will be fixed.
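In case anyone wants to poke at it alongside me, one useful check is to derive the RGB→XYZ matrix from the published Rec.2020 primaries outside the shader and verify that a pure primary round-trips through the inverse. A standalone sketch (plain Python, independent of the Megatron source):

```python
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B (x, y)
D65 = (0.3127, 0.3290)

def solve3(m, b):
    """Tiny Gauss-Jordan solve of a 3x3 system (fine for these matrices)."""
    a = [row[:] + [b[i]] for i, row in enumerate(m)]
    for i in range(3):
        p = a[i][i]
        a[i] = [v / p for v in a[i]]
        for k in range(3):
            if k != i:
                a[k] = [v - a[k][i] * u for v, u in zip(a[k], a[i])]
    return [a[i][3] for i in range(3)]

def mat_from_primaries(prims, white):
    """RGB->XYZ matrix from chromaticities, scaled so white has Y = 1."""
    cols = [(x / y, 1.0, (1 - x - y) / y) for x, y in prims]
    wx, wy = white
    w = (wx / wy, 1.0, (1 - wx - wy) / wy)
    s = solve3([[cols[j][i] for j in range(3)] for i in range(3)], w)
    return [[cols[j][i] * s[j] for j in range(3)] for i in range(3)]

M = mat_from_primaries(REC2020, D65)    # forward: RGB -> XYZ
red = solve3(M, [row[0] for row in M])  # push red's XYZ back through XYZ->RGB
print([round(v, 4) for v in red])       # should be ~[1, 0, 0]
```

If a shader’s hard-coded matrix disagrees with the derived one (or its inverse isn’t the true inverse of its forward matrix), a pure primary picks up contributions from the other two channels, which matches the symptom in the screenshots.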


V2.7.1 Sony Megatron Shader

So this release adds support for scRGB, as in 16-bit HDR. As I’ve said in a previous post, I’m not sure this is actually better than HDR10, because the hardware/driver still quantises the signal down to 10-bit (Rec.2020 PQ) to send over the wire. But it is better if you’re using the DWM compositor, i.e. the Steam overlay etc. So depending on your use case you may or may not see a benefit.
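To give a feel for that down-quantisation in numbers, here’s a toy sketch (not what the driver literally runs) using the published facts that scRGB is linear with 1.0 = 80 cd/m² and the link carries a 10-bit PQ code:

```python
def pq_encode(nits):
    """Linear luminance in cd/m^2 -> PQ signal (SMPTE ST 2084)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def scrgb_to_pq10(v):
    """16-bit float scRGB component -> the 10-bit PQ code sent over the wire."""
    return round(pq_encode(v * 80.0) * 1023)

print(scrgb_to_pq10(2.5))  # 2.5 in scRGB = 200 nits
```

Nearby 16-bit scRGB values can collapse to the same 10-bit code on the wire, which is why the extra precision mainly pays off inside the compositor rather than on the panel.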

What this does show, though, is that you can have one shader that supports all three swap chains that RetroArch provides, which is a good thing for keeping things simple.

I’ve also fixed the colour space menu option, as the naming wasn’t making much sense and the DCI-P3 and Adobe colour spaces were broken (r709/sRGB and r2020 were swapped around, but working). This now also makes a bit more sense in the passes: the SDR pass outputs SDR and the HDR pass outputs HDR (except for the PQ).

Anyway, hopefully this fixes a few issues. I’m still a little concerned about the scanlines warping the colours (there may be bug(s) still), but it’s also worth bearing in mind that a CRT also warped colours where the three beams diverged, i.e. transitioned (at different rates) to the next colour across - but this would be confined to small areas and the overall colours would be correct.


Does it work on Vulkan or is it D3D only?


Both! Just be aware that if you’re using it on a phone or tablet with a high resolution, performance may be affected. I’d recommend sticking with HDR10 on anything that is power sensitive. Also, if you want to achieve super high frame rates on integrated graphics, again I’d recommend HDR10. If you have a discrete graphics card, fill your boots and go for scRGB, as it basically doesn’t matter what you do with RetroArch on them.


Just to report the current state of things on Linux: since “GLSL/HLSL/Vulkan: Fix ExpandGamut colour boost and HDR support flag” (https://github.com/libretro/RetroArch/pull/18772) was merged, the option to enable HDR now appears in the AppImage build, so I was able to test it.

Now in the current nightly build in Linux, the colors are washed out when HDR is enabled, both in the UI and in the game.

Screenshots (current nightly build in Linux AppImage):

Before, it looked like this (stable build in Linux Flatpak):

I don’t know when this (new?) issue arose; before this last PR, the option to enable HDR didn’t appear in the AppImage build, not even when I compiled the code myself. It only appeared in the Flatpak build, which is still stuck on stable.

Enabling or disabling HDR in the system doesn’t change this. Is this still due to the missing metadata, or is it maybe a new bug on the Linux side?


Unrelated to the above, but I’m facing a problem and decided to ask here to see if anyone can point me to the source of it. I’m almost certain the problem is with the monitor rather than the shaders.

When using shaders with a strong mask, such as Sony Megatron or CRT-Guest-Advanced with the mask at 100% strength, I see an effect similar to moiré, but only in blue areas in the game, where vertical lines with a lighter blue area appear. Even when I move the image to the right or left, they remain in the same place. It’s a little hard to see in photos, but I edited one of the raw images to point out each vertical line I identified in it; the raw images are below it:


While searching about this, I found some comments here about something similar, including this one by @Cyber (Sony Megatron Colour Video Monitor), but it doesn’t seem to describe the same thing I’m facing, so I’m not sure. Is this due to my current monitor?

I plan to buy a better one, maybe in a few months, but before that I want to try to understand in some way what’s going on here, to avoid facing the same thing on the new one I’m going to buy.

Edit: my current monitor is a cheap LG UltraGear 24” IPS. Any hardware here where I live is very expensive, so the new one probably won’t be a premium one either; that’s why I want to try to find out what’s going on before buying another one.


Ok, thanks for letting me know - that looks like HDR hasn’t kicked in. Have you got shaders turned off, just to make sure they aren’t affecting things for the time being? So, in your HDR menu options, are you seeing Off, HDR10 and scRGB? I suspect something is broken when it can’t support scRGB, and so it turns off all HDR.

EDIT: ignore my question about the menu option - it’s in the screenshot now I look closer!


This looks great @Cyber! Looking forward to it!


If you’re able to, grab a screenshot and then look at the pixel values in a paint package - see if they’re there in the raw image or whether it’s just your monitor.


Have you tried changing the Colour Gamut/Colour Space in the Shader Parameters?

Try Rec.709 or sRGB instead of Rec.2020.


Yes, there were no shaders loaded when I tried it in the nightly build; I tried it on a live system with a clean retroarch.cfg to make sure nothing would interfere.

EDIT: ignore my question about the menu option - it’s in the screenshot now I look closer!

But I really was seeing HDR Off, HDR10 and scRGB as options.

This cheap monitor barely supports HDR10 (it claims to, but all it seems to do is increase the brightness), but I made sure to test all the options, as well as the Colour Boost options.
