Sony Megatron Colour Video Monitor

Not a huge amount, but it is choosing a different mask based on your CRT screen type, target TVL and display resolution, i.e. aperture grille, 600 TVL and 4K will choose a 4-pixel RGBX mask, whereas aperture grille, 600 TVL and 1080p will choose an MG (magenta-green) mask. Writing that out, I can see why it’s getting brighter: the MG mask turns on twice as many subpixels as the RGBX mask and so is twice as bright.
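
To make that concrete, here’s a back-of-the-envelope check of the subpixel counts (my own sketch, not the shader’s source):

```cpp
#include <cstdio>

int main()
{
    // RGBX repeat: four pixels lit as R, G, B, black. Each pixel has three
    // subpixels, so 3 of the 12 subpixels in the repeat are on.
    double rgbxLit = 3.0 / 12.0;   // 25%

    // MG repeat: magenta (R+B) then green (G), so 3 of the 6 subpixels in
    // the two-pixel repeat are on.
    double mgLit = 3.0 / 6.0;      // 50%

    std::printf("RGBX lit fraction: %.0f%%\n", rgbxLit * 100.0);
    std::printf("MG   lit fraction: %.0f%%\n", mgLit * 100.0);
    std::printf("MG / RGBX brightness: %.1fx\n", mgLit / rgbxLit); // 2.0x
}
```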

What’s the peak nits of your display (as per rtings.com)? I just noticed there is a huge difference in brightness between SDR and HDR on your LG OLED C7V. Have you tried turning off any eco-mode settings? Also, are you definitely sure you are using HDR? (To be honest, if you use the HDR presets it should be very obvious if you aren’t, as everything will look washed out and terrible.)

Just be careful with the Eve Spectrum - I had to wait ages for my monitor, and although I think it’s one of the best monitors for retro gaming, the company is notorious for pissing people off with poor communication and broken promises. It’s more like a Kickstarter than a professional company. The monitor does look fantastic, though, both in its physical design and its screen, and it gets fairly regular firmware updates, which is both good and bad, as they fix some things whilst breaking others (I wish they’d just open-source the firmware!).

Hi chaps, just to say I got round to finishing off the fix for the D3D11 and D3D12 drivers in TATE mode (it should be in last night’s nightly build), so shaders should now work in those drivers for lots of arcade games. The reason I felt this was important is that HDR support is better in the DirectX drivers: they support more screens, with better detection and reading back of the display’s capabilities, etc.

One of the main things the DirectX drivers do is send HDR metadata to the TV, which in theory lets the TV itself decide how best to display the HDR content. In practice it has led to inconsistent results, and Microsoft has pulled the rug on the idea, so one of the main benefits may actually be a downside. I’m going to disable this for RetroArch, but I’m not sure of the ramifications for the DirectX drivers, so I’ll need help with testing.
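
For anyone curious, this is roughly what sending HDR10 metadata looks like at the DXGI level - a minimal sketch, not RetroArch’s actual code, and the luminance numbers are placeholders:

```cpp
#include <dxgi1_5.h>

// Minimal sketch of pushing HDR10 metadata to the display via DXGI.
// The values below are placeholders (Rec.2020 primaries, D65 white point),
// not what RetroArch actually sends.
HRESULT sendHdr10Metadata(IDXGISwapChain4* swapChain)
{
    DXGI_HDR_METADATA_HDR10 md = {};
    // Chromaticity coordinates are encoded in units of 0.00002.
    md.RedPrimary[0]   = 35400; md.RedPrimary[1]   = 14600;  // 0.708, 0.292
    md.GreenPrimary[0] =  8500; md.GreenPrimary[1] = 39850;  // 0.170, 0.797
    md.BluePrimary[0]  =  6550; md.BluePrimary[1]  =  2300;  // 0.131, 0.046
    md.WhitePoint[0]   = 15635; md.WhitePoint[1]   = 16450;  // D65
    // Mastering luminance is encoded in units of 0.0001 nits.
    md.MaxMasteringLuminance     = 1000 * 10000; // 1000 nits (placeholder)
    md.MinMasteringLuminance     = 100;          // 0.01 nits (placeholder)
    md.MaxContentLightLevel      = 1000;         // nits (placeholder)
    md.MaxFrameAverageLightLevel = 400;          // nits (placeholder)

    return swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10,
                                     sizeof(md), &md);
}

// Disabling it, as discussed above, would amount to:
//   swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_NONE, 0, nullptr);
```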

Here’s the page talking about it:

Well, the brown switching thing has gone away and I don’t know what I have changed, if anything.

Regarding the 1080p/4K thing, I thought it was mandatory to use the 4K setting on a 4K monitor to experience the shader as it should be. I’m pretty sure I’m on HDR; that is not the problem (I have been testing the SDR versions to see the differences).

Regarding RTINGS: which measurement do we consider “peak brightness”? I see several different ones there, but I’m guessing we’re talking about “HDR Real Scene Peak Brightness”, for which RTINGS gives my monitor 718 cd/m². Am I right about this?

I also guess that the paper white brightness should be set to more or less half of that?

I believe I’m going to leave this on the 1080p setting and enjoy how awesome it looks.

Thank you for the help

Yeah, it’s not mandatory to match the setting to the display resolution you’re actually using - it just relates to the TVL you’re aiming for. For example, if you use a 1080p screen and a 4-pixel mask, you’re not going to be able to achieve 600 TVL, as you don’t have the physical pixels for it.

For simplicity, the shader removes the need to know how large each mask is: you simply state your display resolution and the TVL you’re targeting, and then the best mask to achieve that is chosen (essentially, I choose it in the shader). And as you’re doing, you can choose whatever display resolution you want - see the rough check below.
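
Here’s a crude version of that feasibility check (my own illustration - it approximates achievable TVL as horizontal pixels divided by the mask repeat width, glossing over TVL’s formal per-picture-height definition):

```cpp
#include <cstdio>

// Can a display fit a TVL target with a given mask repeat width?
// Rough approximation: achievable TVL ~= horizontal pixels / repeat width.
static bool canHitTvl(int horizontalPx, int maskRepeatPx, int targetTvl)
{
    return horizontalPx / maskRepeatPx >= targetTvl;
}

int main()
{
    // 1080p with a 4-pixel mask: 1920 / 4 = 480 triads, short of 600.
    std::printf("1080p, 4px mask: %s\n",
                canHitTvl(1920, 4, 600) ? "ok" : "not enough pixels");
    // The 2-pixel MG mask fits at 1080p: 1920 / 2 = 960 triads.
    std::printf("1080p, 2px mask: %s\n",
                canHitTvl(1920, 2, 600) ? "ok" : "not enough pixels");
    // 4K has room for the 4-pixel RGBX mask: 3840 / 4 = 960 triads.
    std::printf("4K,    4px mask: %s\n",
                canHitTvl(3840, 4, 600) ? "ok" : "not enough pixels");
}
```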

700 nits should be more than enough to create a bright image with all the masks. Have you tried pushing paper white nits right up to 700 as well? That works well on my IPS LCD but not so well on my QD-OLED, so your mileage may vary.
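
For a sense of how paper white and peak nits interact, here’s a sketch of the standard PQ (ST.2084) encode with SDR white pinned to paper white - my own illustration of the general idea, not the Megatron’s actual code:

```cpp
#include <cmath>
#include <cstdio>

// SMPTE ST.2084 (PQ) encode: maps absolute luminance in nits to a [0,1]
// code value on PQ's 10,000-nit scale.
static double pqEncode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double y = std::pow(nits / 10000.0, m1);
    return std::pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
}

int main()
{
    const double peakNits = 718.0;  // e.g. the RTINGS real-scene figure
    const double paperWhites[] = { 350.0, 700.0 };
    for (double paperWhite : paperWhites) {
        // A full-brightness SDR pixel lands at paper white, clamped to peak.
        double nits = std::fmin(1.0 * paperWhite, peakNits);
        std::printf("paper white %3.0f nits -> PQ code value %.3f\n",
                    paperWhite, pqEncode(nits));
    }
}
```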

Last night I released 5.1 of the Sony Megatron shader. This is basically a performance fix for mobile phones, Android TV devices and any other relatively low-powered ARM device.

Here are some pics of various masks using the default preset in SDR mode on my OnePlus 8 Pro (Snapdragon 865), taken with my wife’s iPhone (which isn’t the greatest when it comes to photographing bright screens, but I digress). Best viewed with your device in landscape orientation.

Looks awesome. Is that SDR, then?

Yes! Just added that to the post. My phone is super bright and has probably the best screen I’ve seen in motion outside of a CRT - it’s got a Super AMOLED, I think. All we need now is BFI support in mobile Vulkan and we might be there - my phone’s display supports 120Hz too.

I got a chance last night to try out the shader on the Samsung S95B again, and I made some progress with Game Mode, in that it now doesn’t scale the image.

What was going on was that Windows, behind RetroArch, had swapped to 300% scaling for some reason. I just hadn’t noticed because I was in RetroArch, and I must have tried all my changes inside RetroArch.

So this works a treat in terms of latency but I’m not sure it does much else for retro gaming.

What I really wanted access to is the display’s black frame insertion, which Samsung terms ‘LED Clear Motion’. Sadly this is greyed out in ‘PC’ mode, i.e. when connecting your PC.

After a bit of reading I found you can fool the TV into thinking a games console is connected. Maybe that’s the wrong way to think about it - it may be better to say I changed the mode of the HDMI connection from ‘PC’ to ‘Game Console’.

Regardless, this enables the Clarity settings, BUT at a major cost: chroma compression. Chroma compression destroys this shader because of the 100% mask it employs (see the toy demonstration below). I’ll post some photos to show it, but this is probably the reason half the people using this shader get poor results: it’s not obvious at all that the TV is doing this in this connection mode - it’s only RTINGS mentioning that PC mode guarantees 4:4:4 colour.
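
To see why subsampled chroma is fatal to a single-pixel mask, here’s a toy simulation of 4:2:2 (BT.709 maths; a sketch of the general mechanism, not literally what the TV does):

```cpp
#include <cstdio>

// Two adjacent mask pixels, magenta and green, keep their own luma under
// 4:2:2 but share one averaged chroma sample. For this pair the averaged
// chroma is zero, so both pixels decode to plain grey: the mask is gone.
struct Rgb { double r, g, b; };
struct Ycc { double y, cb, cr; };

static Ycc toYcc(Rgb p) {
    double y = 0.2126 * p.r + 0.7152 * p.g + 0.0722 * p.b;  // BT.709 luma
    return { y, (p.b - y) / 1.8556, (p.r - y) / 1.5748 };
}
static Rgb toRgb(Ycc p) {
    return { p.y + 1.5748 * p.cr,
             p.y - 0.18732 * p.cb - 0.46812 * p.cr,
             p.y + 1.8556 * p.cb };
}

int main()
{
    Ycc a = toYcc({ 1, 0, 1 });   // magenta mask pixel
    Ycc b = toYcc({ 0, 1, 0 });   // green mask pixel

    // 4:2:2: one chroma sample per horizontal pixel pair.
    double cb = (a.cb + b.cb) / 2.0, cr = (a.cr + b.cr) / 2.0;
    Rgb outA = toRgb({ a.y, cb, cr }), outB = toRgb({ b.y, cb, cr });

    std::printf("magenta -> %.2f %.2f %.2f\n", outA.r, outA.g, outA.b);
    std::printf("green   -> %.2f %.2f %.2f\n", outB.r, outB.g, outB.b);
}
```

Both pixels come out as neutral greys (roughly 0.28 and 0.72), which is exactly the desaturated, washed-out look described above.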

Long and short of it: it doesn’t seem you can have BFI with full 4:4:4 colour at the moment.

Reading across various sources, I wouldn’t be surprised if this affects all Samsung TVs.

From what I can see, though, when it’s on, motion clarity is comparable to my PVMs, and it doesn’t appear to have that much impact on brightness - but I can’t be sure because the colours are so screwed.

For the time being I’m falling back to RetroArch’s BFI, which has a dramatic effect on brightness but is more than usable on this bright display - it’s just not as bright as a CRT, or as with BFI off.

I’m going to try and follow this up with Samsung.

This could be due to the TV automatically compensating in the brightness department when BFI is enabled. My 3D TV does the same when 3D is activated.

You don’t get this if you feed the TV a native line-interleaved 3D signal, though: since that doesn’t need the TV’s 3D mode to work, there’s no brightness compensation.

So when you’re using software BFI through RetroArch, the TV isn’t aware that it’s receiving a BFI signal, so it won’t automatically increase the brightness.

Just a theory.

Yes, I’m led to believe this is a different BFI algorithm - the same idea as the backlight strobing that LCDs do, where a rolling bar of black goes down the screen. We can’t do this sort of thing from RetroArch, as we have to transfer whole frames over the HDMI cable, whereas the TV manufacturers have much more control, especially the OLED manufacturers. This is where I think the majority of the brightness differential comes from, although power-saving features could absolutely be kicking in too - I’ve turned off as much of that stuff as I can (or as makes sense), given our more extreme use case.
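
As a rough illustration of the brightness difference between the two approaches (duty-cycle arithmetic only; the 75% rolling-scan figure is an assumption, not a measured value):

```cpp
#include <cstdio>

// Average brightness scales with the fraction of time (or screen area)
// that is actually lit.
int main()
{
    double peakNits = 700.0;  // illustrative panel brightness

    // Whole-frame BFI on a 120Hz panel showing 60fps content: each frame
    // is followed by one black frame, i.e. a 50% duty cycle.
    std::printf("full-frame BFI: %.0f nits average\n", peakNits * 0.5);

    // A rolling black bar covering, say, a quarter of the panel at any
    // instant keeps a 75% duty cycle (assumed figure) for a similar
    // clarity effect.
    std::printf("rolling scan:   %.0f nits average\n", peakNits * 0.75);
}
```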

Hi there. I’ve been trying this shader out, but it seems to be broken for me on my CX. No matter what settings I change, I always get an extremely dim and washed-out image.

Previously I’d been using crt-aperture with RetroArch’s global HDR, and it got very bright, even with BFI. Using Megatron with the same HDR settings set in the shader parameters, though, I can’t come close to the same brightness. While I can make it a bit brighter by messing with the resolution and other settings, it’s still way darker than my old setup - even with the old setup running BFI and Megatron running without it.

Tried taking a comparison pic, but I’m not sure how well it shows off the issue. Hopefully I’m not treading old ground - I read through most of the thread and saw someone else with the same issue, but it was never resolved.

Some level of darkening is to be expected, since it relies on HDR brightness instead of blur/bloom tricks to light pixels when they technically shouldn’t be lit. It shouldn’t be washed out, though, and that image looks to me like it’s not getting tonemapped properly. That is, it looks similar to how HDR content looks on non-HDR displays.

Just to add on top of what @hunterk just said (and putting BFI to one side for a moment): you shouldn’t be getting a washed-out image. One possible cause, as hunterk said, is that you’re not using a full HDR pipeline - i.e. it’s not switched on in Windows, and/or in RetroArch, and/or in the shader parameters.

Another could be that you’re not getting full-colour 4:4:4 output from your PC (whether because of a high refresh rate, the graphics card not supporting it, the wrong output port on the graphics card, the wrong cable, the wrong input port on the monitor, or your PC/app simply being set to use chroma subsampling).

A smaller correction is to use the RBGW display output type in the shader parameters - it should be marked ‘OLED’ - but it’s doubtful that’s what’s causing washed-out colours. Also make sure you are using ‘colour accurate’.

Also try playing around with paper white nits (once you have set peak nits to the correct value, as specified by, say, rtings.com) - whack it right up to peak nits, for example. Do both in the shader parameters (you should ignore the values in RetroArch’s Settings->Video->HDR menu for this shader).

Also, if you suspect HDR isn’t on, using the shader in SDR mode (again via the shader parameters) should fix any washed-out colours.

Thanks for the advice. After doing some more tweaking, I’ve managed to get it looking much better. The biggest thing was changing the resolution in the shader parameters to 1080p, which does wonders for the image, though it predictably loses detail in the shader itself.

Though while it looks nice, I still think something is wrong with the HDR side of my setup, due to an issue I forgot to mention before: I can’t seem to get perfect blacks in any scenario. Blacks always come up grey no matter what I change, unless the image is darkened enough to be unseeable. Has anyone encountered this before?

How do you have your TV set up? Did you rename your HDMI Input to PC and are you using Game Mode?

Did your TV update itself recently? After updates the Black Level setting is sometimes reset to Auto. Perhaps you can try setting it to Low.

Your TV scaling or Aspect Ratio mode needs to be set to Just Scan.

What about your Brightness Settings or Power Saving mode?

Also be careful with the Sharpness settings on your TV and also in your Graphics Driver Control Panel.

Try switching back and forth between GPU and Display Scaling mode.

Lastly, do you have HDMI Deep Color enabled for your HDMI Input?

All of my settings are like that, yeah. I should’ve clarified: I get perfect blacks with every other shader and in general. It’s just the Megatron shader that turns them grey.

Yeah, I have had this same issue since the shader got updated some months ago. The image was perfect for me but then something happened and now everything looks washed out.

I’m using RetroArch on Xbox Series X, and since RetroArch on Xbox doesn’t support HDR, I have to force HDR mode on from a secret menu on my LG CX OLED as a workaround.

That’s definitely not right - that using 1080p makes it better, that is - though I’ve heard quite a few people report it. Usually it’s down to chroma subsampling, but maybe it’s not, and some kind of TV post-processing or something else is going on. I’ve heard quite a few people talk about screen interference when viewing high-frequency patterns on older screens (there’s a term for it, but I can’t quite remember it right now), but part of me thinks this might be down to compression techniques as well.

This shader is particularly sensitive to any kind of compression or post-processing by the TV, or by any other part of the image pipeline, because it uses 100% masks. I think if you turned off any blurs and used 100% masks in other shaders, you’d get the same thing happening. I’d try to prove this theory, but I’ve never suffered the issue on any of the displays I’ve used, so sadly I can’t.

Regardless, let’s ignore all that for the time being and just rule out HDR issues: turn off HDR everywhere - in Windows, in RetroArch, and in the shader parameters (by setting the shader to SDR). This will result in a darker image for sure, but the colours should all be right, and you should be able to use 4K and its associated fine-grained masks perfectly fine (excuse the pun).

Then we can hopefully rule that out and progress from there. I’d love to get to the bottom of this issue, wherever it may lurk.

Obviously, when using SDR it’s particularly critical to turn the TV’s brightness up to max.

Also, what device are you using to feed the image? A Windows 10 PC with an Nvidia 3000-series graphics card, say?