Sony Megatron Colour Video Monitor

I also have a C1 and have found Megatron to be sufficiently bright with RetroArch’s internal BFI set to 1 and the C1’s BFI off.

RetroArch’s internal HDR settings should be disabled when Megatron is active, so that is normal. Megatron’s shader parameter settings not working is odd, though.

Just to verify: setting Megatron’s paper white shader parameter to 0 isn’t making the game screen go black?

2 Likes

No problem and thanks for sharing. When you said you “believe this to be the case”, I thought you might have taken the next step and done some further testing and provided us with some photos for example.

That mask layout came from CRT-Guest-Advanced. It’s Mask Layout 1 (BGR).

These are the kinds of real world experiences that I like to read about. I could just imagine how buttery smooth that must be.

2 Likes

Hello, I have a friend who will lend me a Sony BVM-HX310, so I will be able to do a lot of tests. I would like to know if it is a good screen for Megatron shaders, or if there is really no point in doing tests on such a screen.

1 Like

Of course this should be a good screen for the Megatron Colour Video Monitor Shader. The only thing is @MajorPainTheCactus is busy these days so he might not be able to respond to your request in a timely manner.

As for me, I don’t mind if you take my Sony Megatron Shader Preset Pack for a spin and give me some feedback.

You have to at least take some high quality photos and video of it in action though! This could be a real opportunity to showcase what this Shader is truly capable of. Feel free to ask for some recording/photography tips.

Remember, your room should be completely dark and the camera properly stabilized. Use manual mode with fixed focus, fixed white balance and a 1/60 shutter speed. Set an ISO that properly represents the real world brightness and black levels when played back (so not too bloomy looking), and lower your ISO when capturing from close up so that we can see the correct RGB phosphor colours without the bright colours being overblown (overexposed).

Remember to set the Mask Layout to match the layout of the display.

2 Likes

I have a tripod and a Nikon D7500, so I think we will have no problem with photo quality.

2 Likes

Wow! Now I’m looking forward to it!

In the meantime, anyone ready for some HDR screenshots of Sony Megatron Color Video Monitor in action?

You need to enable HDR in Windows 10 or 11 and use the photo viewer for this to look as intended. If HDR is not enabled, you’ll still see a good quality image but it won’t have the same brightness as what you see during gameplay with Sony Megatron Color Video Monitor running in HDR mode.

https://mega.nz/file/FExXyKbQ#HBKrAf5FTLU8fng-bjOIXjMU6rlYm9etJu28RKEmAsw

https://mega.nz/file/wZhQxIZS#FSJ4CNdmLSflMfMoCmhAcjf2q9w6FA1WUhVY8gIb5HM

I used NVIDIA ShadowPlay to take these, but there are other methods available.

The files can also be viewed on an SDR display by the way.

1 Like

I bought a mini PC to do some HTPC work, and it happened to come with Win11 and its internal IGP supports HDR, so I actually got to try the Megatron shader for the first time. Unfortunately, it really demonstrated how awful my TV is at HDR :sweat_smile: It’s a TCL 4 series from like 5 years ago.

I was able to get the color and dynamic range dialed in by setting the peak luminance to ~250 nits and the paperwhite luminance to ~140 nits. In a totally black room, this looked pretty decent with the 19" low-TVL Sony PVM preset to maximize the lit subpixels, though the dismal brightness means I missed out on any natural eyeball-based bloom. I still got to enjoy the pixel-pr0n, though :slight_smile:

4 Likes

I found a way to be completely independent of Windows AutoHDR or the internal HDR tonemapping the Megatron shader uses. The advantage is that this method can be used with the Megatron ReShade port and any emulator, regardless of which API it uses, which was a problem before.

It is very simple: just turn on HDR in Windows 11 and then drag the SDR Brightness slider all the way to the right (maximum):

Then, in the Megatron shader, make sure to set it to “SDR” and set “SDR: Display’s Colour Space” to “r709”, which ensures that the gamma looks correct and not washed out.

On LG OLED TVs you can also activate Dynamic Tonemapping on top, which makes the picture even brighter.

With this method you are completely independent of any HDR tonemapping from the shader, and the picture with the Megatron shader active is much brighter than in SDR.

4 Likes

Congrats! I know you’re not a noob when it comes to these things but are you sure that TV can’t get brighter than that?

Do you mind sharing the model number so that I can research it?

Also, on some TVs it’s okay to go really high with the Paperwhite brightness, even higher than the peak.

Higher TVLs, like 800 and 1000, can also look brighter.

1 Like

I believe it’s the Walmart spin on this one: https://www.rtings.com/tv/reviews/tcl/s-series-s405-4k-2018

3 Likes

Wow! How low could they go? Based on the RTINGS info your best bet might be to use the Shader in SDR mode.

SDR Brightness

HDR Brightness

You’re getting about 10cd/m² more brightness out of SDR Mode than HDR Mode.

Maybe you can disable the power management, or it’s possible that the picture mode they tested wasn’t the absolute brightest.

1 Like

lmao, yeah, it was hyper-budget back in 2018, so it’s laughably bad now. The HDR color is really blotchy, too–not specifically in the megatron shader, but just in general.

And yeah, there’s not really any appreciable difference between HDR and SDR brightness on it. It gets a hair darker with HDR, which is hilarious, but both modes are generally just too dark to be very useful unless I turn out all the lights lol

2 Likes

Didn’t know that 4k TVs with such low brightness exist!

2 Likes

If possible, using Special K’s HDR Retrofit is a much more functional, less risky option than this. That said:

Just to be sure you and anyone else who considers doing this are aware, that slider controls desktop/SDR paperwhite. 0=80 nits, and each tick on the slider adds 4 nits, so 100=480 nits. (It may be on a slight curve such that 100=500 nits in at least Win11. Reports vary.)
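As a quick sketch of that mapping (assuming the simple linear 4-nits-per-tick behaviour described above, which Windows doesn’t document precisely):

```python
# Approximate SDR-content-brightness slider -> paperwhite nits mapping.
# Assumes the linear 4 nits/tick behaviour described above; Windows may
# actually use a slight curve, so treat this as an estimate.
def slider_to_nits(slider: int) -> float:
    return 80.0 + 4.0 * slider

print(slider_to_nits(0))    # 80 nits  (default SDR reference white)
print(slider_to_nits(30))   # 200 nits (sensible ceiling for desktop use)
print(slider_to_nits(100))  # 480 nits (slider maxed out)
```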

30 (200 nits) is on the high end for actual desktop use, and anything above that is more or less actively trying to burn your OLED, so you absolutely don’t want to crank it and leave it there.

You should also know Windows (currently) maps SDR into HDR using piecewise sRGB gamma, which will be incorrect for a great many things. If you have an 800ish nit peak brightness display, you can change this to pure power 2.2 gamma using dylanraga’s win11hdr-srgb-to-gamma2.2-icm color profile. (He also has a tutorial explaining how to generate profiles for alternate gamma levels and/or different peak brightness displays.)
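To show why that matters, here is roughly how the two decodings differ (piecewise sRGB versus a pure 2.2 power law; the gap is largest in the shadows, which is why SDR content mapped with piecewise sRGB ends up looking lifted and washed out):

```python
def srgb_eotf(v: float) -> float:
    """Piecewise sRGB decoding (what Windows currently uses for SDR-in-HDR)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    """Pure power-law 2.2 decoding (what most displays and content assume)."""
    return v ** 2.2

for v in (0.05, 0.1, 0.2, 0.5):
    print(f"signal {v:.2f}: sRGB {srgb_eotf(v):.4f}  vs  2.2 {gamma22_eotf(v):.4f}")
```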

(Also, last I knew, you could push the slider past 100 by editing the registry if you really wanted to, but the warnings about burn-in risk apply even more so at that point.)

3 Likes

Finally was able to make 480p content look nice on the slot mask after zeroing both attacks, setting horizontal sharpness to 1.00, and choosing 1080p/600TVL with 1.25 min & max, combined with the NTSC shader that GDPD1 decoupled from guest’s shader. Things actually aren’t super blurry while still being less aliased, and the dithering is less noticeable. It’s dark even at max brightness because of my monitor, but still way better than the blurry, macro-blocky, colour-bandy shaders I used before.

An image of my monitor (1440p), where the effect can be seen better.

2 Likes

Thanks for the advice about Special K, I’ll have to try it out. I use the method I mentioned only while playing games with the Megatron shader. When I use my PC normally, I always switch back to SDR with the backlight of my OLED adjusted to around 100 nits, so I think the risk of burn-in shouldn’t be a big problem while playing games with variable content.

I also like that the method of just using the Windows 11 SDR slider keeps the gamut at Rec. 709 and doesn’t oversaturate colours the way Windows 11 AutoHDR does, as I just measured:

The white point is also shifted towards magenta, as you can see in the CIE diagram, but I found out that the Min / Max scanline values of red, green and blue within Megatron can actually be used to calibrate the greyscale / white point to a neutral 6500K. This works much better than using the greyscale adjustments of my TV.

2 Likes

Yes, I presume it’s happening in there too, because there’s pretty much nowhere else it can occur. The method I use is just a very old inverse Reinhard tonemapping (I think! My memory fails me). There are very many ways to tonemap and hence to inverse tonemap. All have various problems and properties, so pick your poison, so to speak. But yes, we should look into implementing a more up-to-date version that fixes the above problems for our problem domain. Overshooting in the original problem domain doesn’t matter much because it’ll be naturally clamped by the display.
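For reference, the basic Reinhard curve and its inverse look something like this (just a sketch of the general idea in Python, not the shader’s actual code):

```python
def reinhard(x: float) -> float:
    """Classic Reinhard tonemap: compresses linear light [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def inverse_reinhard(y: float) -> float:
    """Inverse of the above: expands SDR [0, 1) back out to [0, inf).
    Values near 1.0 shoot off towards infinity, which is why naive
    inverse tonemapping tends to overshoot in the highlights."""
    y = min(y, 0.999)  # clamp so we never divide by zero
    return y / (1.0 - y)
```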

1 Like

Yeah, definitely start out with integer scaling and see if the problem exists there first, then move to non-integer scaling afterwards. The shader should be fully non-integer scaling compliant, as it just looks at the original vertical resolution and fills out scanlines based on the output resolution.
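Rough illustration of what “fills out scanlines based on the output resolution” means (my own sketch, not the shader’s actual code):

```python
# For every output row, work out which source scanline it falls on and how far
# through that scanline it is - this works for any scale factor, integer or not.
def scanline_position(output_row: int, output_height: int, source_height: int):
    source_y = (output_row + 0.5) * source_height / output_height
    scanline = int(source_y)              # which original scanline we're inside
    offset = source_y - scanline - 0.5    # -0.5..0.5 distance from its centre
    return scanline, offset

# e.g. 240 source lines on a 2160-line (4K) panel: 9 output rows per scanline
print(scanline_position(0, 2160, 240))   # (0, ~-0.44) near the top of scanline 0
print(scanline_position(4, 2160, 240))   # (0, 0.0)    dead centre of scanline 0
```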

2 Likes

Yes, this is inherent: CRTs are actually searingly bright at the point of the scanning beam, and then you have a decay of the excited phosphor. BUT most of the display on a CRT is either off or at low brightness; your eyes/brain are doing a lot of the work. On an OLED hardly any of the subpixels are on: 3 in 12, and peak brightness is only achieved for 1 or 2 pixels in 10 in the vertical direction for a 240p image displayed on a 4K screen. So it’s very dark for accurate CRT emulation.
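To put rough numbers on that (back-of-envelope arithmetic, just using the ratios above):

```python
# Rough fraction of a 4K OLED's light output actually used when emulating a
# 240p CRT with a hard mask and scanlines (ratios taken from the post above).
subpixels_lit = 3 / 12     # "3 in 12" subpixels on for the mask
rows_at_peak  = 1.5 / 10   # "1 or 2 pixels in 10" vertically at peak brightness
print(f"~{subpixels_lit * rows_at_peak:.1%} of the panel fully lit")  # ~3.8%
```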

1 Like