Sony Megatron Colour Video Monitor

AFAIK, it doesn't have a GPU, so it's not really running shaders (and therefore can't run the megatron or any derivatives). However, the thing that the megatron shader illustrates is that, once you get the brightness up via HDR, you can just draw scanlines and masks, which are simple enough to implement in FPGAs, without needing to do all of the complex blurs that other shaders use to preserve brightness.
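To illustrate the concept, here's a rough sketch in Python (illustrative pseudocode only, not taken from the actual shader): attenuate each pixel with a hard scanline and a simplified RGB aperture mask, then multiply by an HDR gain to recover the lost brightness instead of blurring.

```python
# Minimal sketch of the "HDR brightness + scanlines + mask" idea.
# All names and numbers here are illustrative, not from the real shader.

def scanline_weight(y, period=3):
    """1.0 on the scanline, 0.0 in the gap between lines (hard, simplified)."""
    return 1.0 if (y % period) < 2 else 0.0

def mask_weight(x):
    """Simplified RGB aperture mask: each pixel column passes one channel."""
    weights = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    return weights[x % 3]

def shade(pixel_rgb, x, y, hdr_gain=3.0):
    """Apply mask and scanline, then an HDR gain to restore brightness."""
    sl = scanline_weight(y)
    mw = mask_weight(x)
    return tuple(min(c * m * sl * hdr_gain, 1.0) for c, m in zip(pixel_rgb, mw))
```

The mask and scanline throw away roughly two thirds of the light, which is why the gain (i.e. HDR headroom) is what makes the full-strength mask viable.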

4 Likes

Has anyone gotten Megatron to look bright enough on an OLED w/BFI? I was assuming you’d need HDR1000+ for Megatron if you’re also using BFI.

I think it’s impossible on OLED. It’s funny how much this feels like Plasma vs LCD all over again.

1 Like

Not on my C2; too dim for my liking as well.

2 Likes

Of course it doesn't have a GPU, since it uses an FPGA. You can technically render shaders (graphics) with a CPU, too.

I imagine the shader would have to be rewritten from scratch for the RT4K's FPGA.

The RT4K FPGA is supposedly pretty powerful, but I'm not sure if it's powerful enough for Megatron.

That said, tbh the dense triad (slot or dot) shaders on the RT4K look close enough to the 300TVL Mega/Cybertron to me, so I'm unsure if it would be worth the effort. But again, that could be just me.

Shaders are a specific thing: programs that run on a GPU and are written in a high-level shader language or GPU assembly. FPGAs can technically be on a board that also has a GPU, just as they can share a board with, say, an ARM CPU (like the DE10-Nano), but that’s not happening with the RT4K, AFAIK. Yes, you can render shaders on a CPU using an emulated/soft-GPU like llvmpipe, but that’s not an option on the RT4K, either.

It would not have to be rewritten; its functionality would have to be reimplemented entirely, which is possible, as I mentioned twice, but at that point it is not really related to the megatron shader except on a conceptual level, which is: HDR for brightness, draw scanlines and mask. And AFAIK that's already in place on the RT4K.

@MajorPainTheCactus This whole tangent is pretty far off-topic, so let me know if you’d like me to spin it out into its own thread.

3 Likes

Semantics. That is more or less what I meant.

Ty Cyber for everything as well!

1 Like

So I had some time over the last day or so to investigate this problem where certain values end up outside of the rec. 601/709 gamut triangle, i.e. they are in wider colour gamut spaces.

This happens when extremely small values (extremely dark colours) go into the ST 2084 PQ function. Presumably due to floating-point inaccuracies, it 'warps' these very dark colours out of the rec. 709 gamut.

You can 'fix' this by adding a clamp(colour, 0.000001, 1.0) before feeding the colour into the HDR10 transform, i.e. add this line:

scanline_colour = clamp(scanline_colour, vec3(0.000001f), vec3(1.0f));

at line 1669 in crt-sony-megatron.slang.

This comes at the cost of technically non-zero blacks, although it probably rounds down to zero black anyway.
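For anyone who wants to see the shape of the fix outside the shader, here's a small Python sketch: the PQ constants are from SMPTE ST 2084, and the clamp value mirrors the one quoted above for crt-sony-megatron.slang. (This is just an illustration of the approach, not the actual shader code.)

```python
# ST 2084 (PQ) inverse EOTF with a near-black clamp, as described above.
# Constants come straight from the SMPTE ST 2084 specification.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(y):
    """ST 2084 inverse EOTF: y is display luminance normalised to 10000 nits."""
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

def safe_pq(colour):
    """Clamp near-black channels before PQ-encoding, per the fix above."""
    return tuple(pq_encode(min(max(c, 0.000001), 1.0)) for c in colour)
```

The point of the clamp is to keep the tiny values away from the region where limited-precision pow() results drift the colour out of gamut; the cost is that black becomes the PQ code for 0.01 nits rather than true zero.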

Because this has no real impact on the final image, as these darks are simply too dark for anyone to perceive that they are slightly outside the gamut, I'm going to leave the shader as is. People can use Windows 11 AutoHDR if they really want a solution to this, but I'd defy anybody to see the difference (in fact, I'd really like to know if you can, and more importantly in what scenario). I presume Windows 11 AutoHDR uses higher precision and/or a higher-precision power function, which is why it's able to resolve this issue properly rather than using a clamp.

5 Likes

Excellent news! This definitely explains why we needed the analysis tools to see the issue in the first place.

I will experiment with that clamp fix at some point, but by the sound of things I very much doubt that I will see anything that you didn't.

It's worth noting that even if Win11 AutoHDR is (technically) more accurate in this very specific regard, it's still using a piecewise sRGB decoding gamma for SDR to HDR conversion, which is incorrect for literally everything older than the mid-00s, and is only correct for a smattering of PC-exclusive games past that point.

Incorrect decoding gamma would be waaaaaaaaay more noticeable than near-black gamut overshoot.
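For reference, here's a quick Python comparison of the two decodes being discussed: the piecewise sRGB EOTF (from IEC 61966-2-1) versus a pure 2.2 power curve. They diverge most in the shadows, which is exactly where CRT-era content (mastered against a simple power-law display) gets lifted by the piecewise decode.

```python
# Piecewise sRGB decode vs pure power-law decode, to show where they diverge.

def srgb_decode(v):
    """Piecewise sRGB EOTF (IEC 61966-2-1): linear toe, then a 2.4 power segment."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma_decode(v, gamma=2.2):
    """Pure power-law decode, closer to how a CRT actually behaved."""
    return v ** gamma
```

At a signal value of 0.02, the piecewise decode yields several times more linear light than the 2.2 power curve, so the wrong choice visibly raises near-black levels.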

2 Likes

Ah, interesting. I haven't had the chance to upgrade to Win11 to try this out. One thing I would say is that we can transform into sRGB gamma space in SDR using existing shader settings, and so Windows 11 AutoHDR would be able to do the correct transform into rec. 2020 - would it not? As I say, I haven't got Win11 to try this, so I could be missing something.

2 Likes

I found I was able to select Vulkan again by going into the Mupen core options and selecting the paraLLEl-RDP option for the RDP plugin.

3 Likes

I was struggling to get a PC source port of Zelda: Ocarina of Time (Ship of Harkinian) working with this shader via ReShade. It uses DX11, but in the end the AutoHDR add-on caused the game to crash on startup. I managed to get HDR functioning via Special K, and the shader seems to be playing along decently. Are there any known issues with this kind of implementation that I should be aware of?

1 Like

So Megatron/ReShade AutoHDR and Special K aren’t (inherently) doing the same thing with HDR.

At least with more modern games, Special K is recovering and presenting the native/true HDR information that a game was rendered with internally before it was converted into SDR for the final presentation.

Megatron and ReShade AutoHDR are instead converting the SDR image directly into an HDR space and then cranking the brightness to 11 to offset the brightness lost as a result of using a full strength phosphor mask.

But I suspect you can set up Special K to do what Megatron is doing, given how ludicrously versatile it is? And that may even be what it does by default for games that aren't internally rendered in HDR? (Assuming more recent versions of Harkinian aren't rendering in HDR internally - I haven't looked at what has been added for a bit now.)

3 Likes

Actually, stopping to think on it, is Win11 AutoHDR even fit for purpose?

Like, it's a machine learning "AI" based black box attempting to mimic what a native HDR presentation for the material in question would look like, rather notoriously trained on the presumption that all games utilize the same realist "AAA" art style (or at least trained with those sorts of games in mind as its primary use case).

Compared to AutoHDR, one could (probably) get more accurate results by directly displaying the SDR image in HDR using the “HDR/SDR brightness balance” slider as our paper white (with registry edits for higher values), tho that would also clamp the colors to rec 709, and that piecewise sRGB decode would still have to be compensated for somehow.
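The paper-white idea above can be sketched in a few lines of Python. To be clear about the assumptions: the 203-nit default below is the common HDR reference white (per ITU-R BT.2408), not a value I've confirmed Windows uses, and I'm using the pure 2.2 decode rather than the piecewise sRGB one, per the earlier posts.

```python
# Rough sketch of "present SDR in an HDR container at a chosen paper white":
# decode to linear light, scale by paper-white nits, normalise to PQ's
# 10000-nit range. The 203-nit default is an assumption (BT.2408 reference
# white), not a documented Windows value.

def sdr_to_hdr_linear(v, paper_white_nits=203.0):
    """Map an SDR signal value to PQ-normalised linear light."""
    linear = v ** 2.2  # pure power-law decode, as discussed above
    return linear * paper_white_nits / 10000.0
```

Raising the paper white just scales the whole image linearly, which is why this route changes brightness without touching the colours - the rec. 709 clamp mentioned above is a separate limitation of the container, not of this mapping.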

I know that the current color profile based solutions wouldn't work for correcting the piecewise sRGB decode, as they also affect the final HDR image.

If Special K can be set up to use HDR solely to boost the brightness of the original SDR image, that may actually be the best course. I’ll have to look into this some more…

4 Likes

Blimey, is it really using machine learning? I would've thought it'd just use a calibration screen on start-up and be done with it. I mean, yes, you might want to get a more artistic look, but then it doesn't know the intention, especially for any new games. Sounds a little over-engineered to me - sounds like bored developers trying to jump on the A.I. bandwagon. :joy: I suppose there isn't a good solution, so why not. 🤷 (Except for the GPU cost and expense of electricity.)

How does Special K know where to inject (the HDR tonemapper) and what to skip/replace? Lots of games wrap/embed tonemapping into the final upscaler, for instance.

2 Likes

Yep, part of Microsoft’s desperate search for a proper nail to go with their fancy new AI hammer.

In fairness to them, AutoHDR does by default enforce a whitelist of games it has been (hypothetically) tuned for. It’s a portion of the community that has decided to force it more generally.

I'm not sure right off. Probably best to go straight to the source and search/ask on the Special K discord. I suspect they would be pleased to have you around in general, given that both Megatron and your "AutoHDR" ReShade plugin come up for discussion on occasion, and they take a certain pride in being a one-stop shop for HDR discussion. (Tho personally I still have very mixed at best feelings about discord overtaking forums for these sorts of discussions.)

2 Likes

Yes, I have a big problem with this, as well. It’s all down the memory hole these days. To be fair, discord is to IRC what reddit is to forums, but reddit is becoming harder to search now, too.

2 Likes

Is it possible to have Slot Mask and Shadow (Dot) Mask Presets that look like this currently in Sony Megatron Color Video Monitor (not including the overlay/bezel/background)?

These all look immaculate on my WOLED TV in terms of CRT Mask structure.

I would really like to be able to do these using Sony Megatron Color Video Monitor.

2 Likes

Set TVL to 600, all three Scanline Min settings to 1.00, and all three Scanline Max settings to either 1.00 or 1.50.

You can also try Mins at 1.50 with the Maxes at either 1.50 or 2.00, but that will probably be a bit more diffuse and bloomy than what you are looking for.

2 Likes