Sony Megatron Colour Video Monitor

lmao, yeah, it was hyper-budget back in 2018, so it’s laughably bad now. The HDR color is really blotchy too, and not specifically in the Megatron shader, just in general.

And yeah, there’s not really any appreciable difference between HDR and SDR brightness on it. It gets a hair darker with HDR, which is hilarious, but both modes are generally just too dark to be very useful unless I turn out all the lights lol

2 Likes

Didn’t know that 4K TVs with such low brightness existed!

2 Likes

If possible, using Special K’s HDR Retrofit is a much more functional, less risky option than this. That said:

Just to be sure you and anyone else who considers doing this are aware: that slider controls desktop/SDR paperwhite. 0 = 80 nits, and each tick on the slider adds 4 nits, so 100 = 480 nits. (It may be on a slight curve such that 100 = 500 nits, in at least Win11; reports vary.)
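In code terms, the nominal mapping is just this (ignoring the possible slight curve mentioned above):

```python
# Nominal Windows "SDR content brightness" slider -> paperwhite nits,
# per the 80-nit base + 4 nits/tick behaviour described above.

def sdr_slider_to_nits(slider: int) -> int:
    """Slider position (0-100) -> desktop/SDR paperwhite in nits."""
    return 80 + 4 * slider

print(sdr_slider_to_nits(0))    # 80 nits
print(sdr_slider_to_nits(30))   # 200 nits, already high for desktop use
print(sdr_slider_to_nits(100))  # 480 nits
```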

30 (200 nits) is on the high end for actual desktop use, and anything above that is more or less actively trying to burn your OLED, so you absolutely don’t want to crank it and leave it there.

You should also know Windows (currently) maps SDR into HDR using piecewise sRGB gamma, which will be incorrect for a great many things. If you have an 800ish nit peak brightness display, you can change this to pure power 2.2 gamma using dylanraga’s win11hdr-srgb-to-gamma2.2-icm color profile. (He also has a tutorial explaining how to generate profiles for alternate gamma levels and/or different peak brightness displays.)
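For reference, here's where the two curves differ; they diverge most in the shadows, which is why SDR content mapped with piecewise sRGB can look lifted near black:

```python
# Piecewise sRGB EOTF (IEC 61966-2-1), what Windows currently uses for
# SDR-in-HDR, vs. a pure power-2.2 curve like the linked ICM profile applies.

def srgb_piecewise_eotf(v: float) -> float:
    """Piecewise sRGB decode: linear segment near black, 2.4 power above."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def power22_eotf(v: float) -> float:
    """Pure power 2.2 decode."""
    return v ** 2.2

for v in (0.05, 0.10, 0.25):
    print(v, srgb_piecewise_eotf(v), power22_eotf(v))
# Near black the piecewise curve outputs more light, so shadows look raised
# compared to content mastered against a pure 2.2 display.
```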

(Also, last I knew, you could push the slider past 100 by editing the registry if you really wanted to, but the warnings about burn-in risk apply even more so at that point.)

3 Likes

Finally I was able to make 480p content look nice on the slot mask: zeroing both attacks, setting horizontal sharpness to 1.00, choosing 1080p/600TVL, and setting min & max to 1.25, combined with the NTSC shader that GDPD1 decoupled from guest’s shader. Things actually aren’t super blurry while still being less aliased, and the dithering is less noticeable. It’s dark even at max brightness because of my monitor, but still way better than the blurry-textured, macroblocky, heavily colour-banded shaders I used before.

An image of my monitor (1440p), where the effect is easier to see.

2 Likes

Thanks for the advice about Special K, I’ll have to try it out. I only use the method I mentioned while playing games with the Megatron shader. When I use my PC normally, I always switch back to SDR, with the backlight of my OLED adjusted to around 100 nits. So I think the risk of burn-in shouldn’t be a big problem while playing games with variable content.

I also like that the method of just using the Windows 11 SDR slider keeps the gamut at Rec.709 and doesn’t oversaturate colors the way Windows 11 AutoHDR does, as I just measured:

The white point is also shifted towards magenta, as you can see in the CIE diagram, but I found out that the min/max scanline values of red, green and blue within Megatron can actually be used to calibrate the greyscale/white point to a neutral 6500K etc. This works much better than using the greyscale adjustments of my TV.

2 Likes

Yes, I presume it’s happening in there too, because there’s pretty much nowhere else it can occur. The method I use is just a very old inverse Reinhard tonemapping (I think! My memory fails me). There are very many ways to tonemap and hence to inverse tonemap. All have various problems and properties, so pick your poison, so to speak. But yes, we should look into implementing a more up-to-date version that fixes the above problems for our problem domain. Overshooting in the original problem domain doesn’t matter much because it’ll be naturally clamped by the display.
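Roughly, from memory, it’s something like this (a sketch of plain Reinhard and its inverse, not the shader’s exact code):

```python
# Classic Reinhard tonemapping and its algebraic inverse. The inverse blows
# up as the input approaches 1.0, which is the overshoot mentioned above.

def reinhard(l: float) -> float:
    """Map linear luminance [0, inf) into [0, 1)."""
    return l / (1.0 + l)

def inverse_reinhard(l: float, peak: float = 1.0) -> float:
    """Expand a tonemapped value in [0, 1) back out to linear luminance,
    scaled by an assumed display peak. Clamped to avoid division by zero."""
    return peak * l / max(1.0 - l, 1e-6)
```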

1 Like

Yeah, definitely start out with integer scaling and see if the problem exists there first, then move to non-integer scaling afterwards. The shader should be fully non-integer-scaling compliant, as it just looks at the original vertical resolution and fills out scanlines based on the output resolution.
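In pseudocode terms it’s something like this (illustrative only, not the actual shader code):

```python
# Hypothetical sketch of scanline placement that doesn't need an integer
# scale factor: each output row maps back to a fractional position within
# its source scanline, and the beam brightness is a function of that phase.

def scanline_position(out_y: int, out_height: int, src_height: int):
    """Return (source_line, phase), where phase in [0, 1) is the vertical
    position inside that source scanline."""
    v = (out_y + 0.5) / out_height * src_height
    line = int(v)
    return line, v - line

# e.g. a 240p source on a 1440-line output: rows 0-5 belong to scanline 0
# at phases 1/12, 3/12, ... so the falloff is sampled smoothly either way.
```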

2 Likes

Yes, this is inherent: CRTs are actually searingly bright at the point of the scanning beam, and then you have a decay of the excited phosphor. BUT most of the display on a CRT is either off or at low brightness; your eyes/brain are doing a lot of the work. On an OLED hardly any of the subpixels are on: 3 in 12, and peak brightness is only achieved for 1 or 2 pixels in 10 in the vertical direction for a 240p image displayed on a 4K screen. So it’s very dark for accurate CRT emulation.
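Back-of-the-envelope with those numbers (assumed, not measured):

```python
# Rough duty cycle for the figures quoted above: 3 of 12 subpixels lit
# horizontally, ~1-2 of 10 output rows at peak per 240p scanline on 4K.

horizontal_fraction = 3 / 12    # mask leaves 1/4 of subpixels on
vertical_fraction = 1.5 / 10    # ~1-2 rows in 10 at peak brightness
duty_cycle = horizontal_fraction * vertical_fraction
print(f"lit at peak: {duty_cycle:.1%} of the panel")  # ~3.8%

# So a panel capable of, say, 800 nits full-field averages only ~30 nits
# here, which is why accurate CRT emulation looks so dark.
print(800 * duty_cycle)  # 30.0
```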

1 Like

That’s probably not a good idea, as your colours will probably be off: fundamentally, the subpixel red, green and blue on your HDR TV are different reds, greens and blues to an SDR TV’s. If you don’t compensate for that by turning on the other subpixels slightly, you’re going to get wrong colours. In SDR mode your TV probably does this automatically, so you’re no worse off. Depends on your TV of course, and what its native colour gamut is.
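For illustration, the standard BT.709-to-BT.2020 conversion (ITU-R BT.2087) is the kind of compensation I mean; applying it is exactly "turning on the other subpixels slightly":

```python
# Linear-light Rec.709 RGB -> linear-light Rec.2020 RGB, using the standard
# ITU-R BT.2087 matrix. On a wide-gamut display, a pure Rec.709 primary
# needs small contributions from the other two subpixels to look right.

M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """rgb: linear Rec.709 triple -> linear Rec.2020 triple."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020)

# Pure Rec.709 red drives the green and blue subpixels a little too:
print(rec709_to_rec2020((1.0, 0.0, 0.0)))  # -> (0.6274, 0.0691, 0.0164)
```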

1 Like

Is that an OLED? It’ll be very accurate colour-wise; not sure about brightness though. If it’s an old OLED BVM it’ll probably have bad lag. Amazing to have one though!

1 Like

Amazing! Looking forward to seeing photos!

1 Like

Fantastic! But yes, sadly those nits aren’t enough. Time to mod your TV! :rofl: Take off all the limiters and keep it outside, away from anything flammable. You’d need to only use it in the dead of night though.

2 Likes

Yes, perfect! We’re only doing what Windows 11 has baked in, so it’s only a backwards-compatibility thing really. SDR will provide the same image as HDR with this method. I use it all the time on my mobile phone, as its display doesn’t need an HDR mode to get bright (presumably they’ve completely hidden all the HDR nonsense from the user on Android).

1 Like

I’m not too sure burn-in is that much of a risk with this shader. I could be wrong though, so it’s maybe worth erring on the side of caution. The reason I say that is that only 3 in 12 subpixels horizontally are on, and only 1 or 2 out of 10 pixels vertically are at max. Presumably burn-in is caused by heat, with surrounding subpixels being on and pushing an individual subpixel over its limit, and because most of the display is actually off with this shader, the risk of burn-in is presumably low. It’s just my guess though; I have no hard proof or tests.

1 Like

Superb! I do like the NTSC look sometimes; it gives a different air to things. Much smoother gradients and the like, but also a bit vaseline-y.

2 Likes

Yes, personally I wouldn’t be too worried about burn-in with this shader, as the screen is mostly off/low, BUT I can’t guarantee it, so be careful whatever method you use!

Also, with the min/max/attack scanline values, that makes complete sense to me, as we’re simulating 30-year-old CRTs whose scanlines have invariably degraded over time, shifting the overall colour in different directions. My 2730QM has a stronger red scanning beam, and that’s what I modelled in this shader. Setting all the scanlines to the same length and attack should fix that, but then those defects are part of the CRT look. What I will say is that on an old CRT this varies across the whole screen: in some areas green is stronger, in some blue, so there’s no right solution, and I doubt many people have the patience to model it down to that level. Famous last words… :rofl:

1 Like

Probably, but for me this is a very important piece of the nostalgia pie. For others it might be reproducing curvature or blurriness, err… vaseline-iness. Lol

They say ignorance is bliss, so maybe it’s a good thing that I use CRT shaders exclusively and trust my memory and imagination to determine what looks and feels right. In any case, I honestly don’t have any issue with colours using Mask Accurate Mode with the shader anymore.

On another note, you’ll see that I was able to figure out how to capture hybrid HDR/SDR screenshots in JXR format using the shader.

My next challenge is how to record HDR video while using the shader but this is proving to be even more elusive.

On my system, I can’t use NVidia Shadow Play to capture HDR video, only SDR.

If I try to use OBS, the only options I’m allowed to use to capture HDR are the P010 (10-bit, 4:2:0) colour format with Rec. 2100 PQ or Rec. 2100 HLG.

Obviously this is unacceptable due to the chroma subsampling.

I think I might be able to work around these limitations by using the shader in SDR mode, then making my recording using RGB 4:4:4 or whatever colour format NVIDIA ShadowPlay uses.

I’ve never had a problem with ShadowPlay applying chroma subsampling to my SDR captures.

After making my SDR video, I would then like to apply the same principles and techniques of inverse tonemapping from sRGB (or whatever SDR colour space Megatron tonemaps from) into Rec. 2020 (or whatever HDR colour space Megatron tonemaps into).
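Something like this is the transfer-function half of what I think I need (my rough understanding, not Megatron’s actual math); the gamut half would be the standard BT.709-to-BT.2020 matrix:

```python
# Sketch of the per-pixel transform: decode sRGB to linear, scale to an
# assumed paperwhite in nits, then PQ-encode (SMPTE ST 2084) for HDR10.
# The 203-nit paperwhite below is an assumption, not a Megatron value.

def srgb_to_linear(c: float) -> float:
    """Piecewise sRGB decode to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 PQ encode; absolute luminance in nits (0-10000)."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1.0 + c3 * y**m1)) ** m2

# e.g. SDR white (1.0) placed at a 203-nit paperwhite:
print(pq_encode(srgb_to_linear(1.0) * 203))  # ~0.58 PQ signal
```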

I would like to do all of this using video editing software for example, Premier Pro, DaVinci Resolve, Hit Film or Kdenlive.

Is it possible for one or more of you more learned folks here to assist me with some tips as to how I might go about this and if it’s even possible?

So basically I would like to start with Megatron in SDR mode but end up with a video at the end that looks like I recorded it using the shader in HDR Mode.

I’m including @Dogway, @Azurfel, @Dennis1 and @rafan in my list of persons who might possibly be able to assist.

1 Like

If you’re on Windows 11 it might be best just to use the Auto HDR it provides and have the shader in SDR mode. Presumably OBS works with this feature, but I don’t know.

1 Like

Thanks for the tip. Ultimately I would like a file properly inverse-tonemapped from SDR to HDR that is Rec. 2020 and HDR10 compliant, so that when played back or uploaded to YouTube it triggers the playback device to switch into HDR mode and looks similar to how it looks when using the shader in HDR mode.

1 Like