Sony Megatron Colour Video Monitor

It’s LG’s latest 42-inch (OLED42C2)

2 Likes

Hi @MajorPainTheCactus

I’m loving the Megatron shader. I’ve been doing some more extensive testing with various cores and overall everything works great, but I did find one thing that could possibly be improved. Not a bug I think, but more something feature related :slight_smile:

The bottom line is that interlaced content does not look right. For example, many PlayStation 1 games mix low-res (320x240), hi-res (640x240) and hi-res interlaced (640x480i) content, and it’s the interlaced content in particular that renders incorrectly.

I’ll explain a bit more thoroughly; hopefully it’s something you could look into.

I’m seeing two issues happening:

  1. Interlaced content does not look right, as scanlines disappear completely.

  2. What I would call “medium-res” content (i.e. 640x240) looks too sharp. I think this is because the shader’s “sharpness” setting is tuned for low-res (320x240), but when a game switches to 640x240 in-game, the horizontal filtering is not adjusted to the new resolution (a rough sketch of what I mean follows below).
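
As a rough illustration of issue 2 (a hypothetical sketch with made-up names, not the Megatron’s actual code), a horizontal filter could derive its kernel radius from the reported source width, so that 640-wide content is filtered over twice as many texels as 320-wide content and keeps the same apparent softness:

```glsl
// Hypothetical sketch: scale the horizontal blur radius with the source
// width so 640x240 content looks as soft as 320x240 content.
// params.SourceSize is the standard slang uniform (xy = size, zw = 1/size).
vec3 widthAwareBlur(sampler2D src, vec2 uv)
{
    int radius = int(max(1.0, params.SourceSize.x / 320.0)); // 1 at 320, 2 at 640
    int taps   = 2 * radius + 1;

    vec3 sum = vec3(0.0);
    for (int i = -radius; i <= radius; i++)
        sum += texture(src, uv + vec2(float(i) * params.SourceSize.z, 0.0)).rgb;

    return sum / float(taps); // simple box average over the sampled texels
}
```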

On issue 1, a bit of extra explanation may help. Some cores support real-time resolution switching just like the original hardware did, for example the Mednafen PSX core, the BlueMSX core, the Commodore Amiga PUAE core and some others.

As an example of the many oddball resolutions that may occur, here’s a video that lists all PS1 games that use interlaced as the base resolution for the entire game (he leaves out games that only occasionally switch to interlaced for title screens etc.):

Playstation 1 ALL (?!) Interlaced (Hi-Res) Games

The resolution table mentioned in the description’s “more” link below the video is interesting:

00:00 Intro
00:05 Hi-Res Mode?
00:35 Debunking 480i Myths
02:18 320x240
02:45 256x480
03:11 312x448
03:34 320x480
04:39 368x480
07:15 382x480
07:45 384x480
08:14 512x478
09:36 512x480
13:52 640x448
14:09 640x480

To keep this post from getting too long, and not knowing how familiar you are with the technicalities of CRT interlacing: the summary is that for a correct CRT interlacing simulation you need to separate the odd-line and even-line fields from the emulator’s frame output, and alternate those fields at a slight offset from each other.

Note that on a CRT the offset is about 0.5 lines. This works because it’s analog television: the phosphors of the previous field have already faded out by the time the other field, offset by half a line, is displayed. Rinse and repeat. Through persistence of vision, the human eye and brain then see a picture 480 lines tall. If you’re interested I could provide more info on CRT interlacing.
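
To make the field-alternation idea concrete, here’s a minimal slang-style sketch (my own illustration for this post, not taken from any existing shader). Each frame it draws only the lines belonging to the current field, selected by frame parity; a fuller version would also shift the odd field down by half a line as described above:

```glsl
#version 450

layout(push_constant) uniform Push
{
    vec4 SourceSize;   // xy = source size, zw = 1.0 / size
    uint FrameCount;
} params;

layout(std140, set = 0, binding = 0) uniform UBO
{
    mat4 MVP;
} global;

#pragma stage vertex
layout(location = 0) in vec4 Position;
layout(location = 1) in vec2 TexCoord;
layout(location = 0) out vec2 vTexCoord;

void main()
{
    gl_Position = global.MVP * Position;
    vTexCoord   = TexCoord;
}

#pragma stage fragment
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D Source;

void main()
{
    // Source scanline this fragment falls on, assuming the core outputs a
    // full 480-line frame each vsync (e.g. 640x480i content).
    float line  = floor(vTexCoord.y * params.SourceSize.y);
    float field = mod(float(params.FrameCount), 2.0); // 0 = even, 1 = odd

    // Draw only the current field's lines; leave the other field dark, as
    // its phosphors would already have decayed on a real CRT.
    FragColor = (mod(line, 2.0) == field)
        ? texture(Source, vTexCoord)
        : vec4(0.0, 0.0, 0.0, 1.0);
}
```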

As an example, this guy explains it nicely. It’s a simple explanation using just his hands, and while it’s about 1080p/1080i, the idea is the same for 480i on a CRT:

TV Explained: Progressive and Interlaced. The interesting part is from 3:34, which is where the previous link starts.

I know guest.r has done a great job implementing interlacing into his advanced shader, which is virtually indistinguishable from a real CRT. I’m hoping something similar could be achieved for the Megatron shader.

Possibly the de-interlacing.slang shader from HunterK in the misc folder could also provide some additional insight into what’s happening, although it seems to be meant for “normal” content and isn’t directly applicable to a shader that simulates scanlines etc.

I think this could really put the icing on the cake. As always thanks again for the superb Megatron!

2 Likes

Great investigation, thanks! Yes, I’m guessing de-interlacing doesn’t work, as I haven’t looked at it. I’ll definitely add it to the to-do list.

My main focus at the moment, with regard to CRT shaders at least, is getting TATE mode working for the D3D9, 10, 11 and 12 drivers. I’ve made some progress, but there’s one more thing to fix before I can submit to GitHub. I hasten to add that this has nothing to do with the Sony Megatron shader itself but with the drivers.

4 Likes

I came across this page that tries to list all the subpixel geometries used in practice on modern screens, including phones. It’s not for any direct use, but since there’s been regular discussion of subpixel types I thought I’d flag it anyway:

Subpixel Zoo - A Catalog of Subpixel Geometry

Also flagging this to @nesguy and @Cyber, our resident subpixel gurus :slight_smile:

5 Likes

Just got an HDMI cable so I can use RA on my LG OLED CX! Any suggested settings? I’ve tried messing around with the HDR settings, “peak luminance” and “paper white luminance”, but the image is still very dark.

HDR is working; I just can’t seem to find the proper settings.

2 Likes

@datjocko seems to have gotten pretty decent results. Maybe they can share some settings?

Or maybe even @Wilch?

1 Like

Hi @e2zippo, hmm that’s interesting; as @Cyber says, some other users with that TV seem to be getting good/acceptable results, so there’s hope at least. If you use SDR, is it even darker?

1 Like

I have an LG CX too and the HDR picture is bright and awesome with the Megatron.

I’m using RetroArch on Xbox Series X, where you can’t normally use HDR, but there’s a workaround: go to ’All Settings’ and press ’1, 1, 1, 3, 1, 1, 1’ while the ’Picture Mode Settings’ text is highlighted. This opens the ’HDMI Signalling Override’ screen, where you can force HDR on by selecting ’ST2084’ for ’EOTF’.

But yeah, Megatron is awesome! I can’t really use any other shader anymore. Hopefully TATE support comes out soon; can’t wait to try this on some shmups.

2 Likes

“The photo was to demonstrate the improvement to the mask on my OLED, but the overall image still had clipping/crush issues in the highlights and on similar colours unless using the JVC presets as described here: However, using the new colour-accurate parameter, all these issues are gone unless pushing the paper-white too high. Everything looks correct with paper-white at around 550 (with peak nits at 720). Thanks for the amazing update!”

“using the new colour-accurate parameter”

What is this and how do I use it? I seem to be having clipping/crush issues as well.

Another thing I’m wondering is what settings you guys are using on the TV? @Scarf @datjocko @Wilch
Specifically Dynamic Tone Mapping: On / Off / HGiG?

1 Like

Hi @e2zippo, make sure you’re up to date with RetroArch and then make sure you update your slang shaders. You do the latter by selecting a video driver that supports slang shaders (Vulkan/D3D12/D3D11 should all work), then going to the Online Updater and choosing ‘Update Slang Shaders’. Once you’ve got the latest you should see an option in the shader parameters for ‘mask accurate/colour accurate’. Also, you can choose an OLED sub-element layout of ‘RBG’ instead of ‘RGB’. Should work.

2 Likes

Hi @Scarf, glad you’re liking the Megatron. You should be able to play in TATE mode on the Vulkan driver, if you can use that (possibly not on an Xbox, but definitely on PC).

Here are a few screenshots I’ve just taken - I have to say I bloody love it:

4 Likes

I’m all up to date. I just realized I’ve been doing something stupid: I’ve been trying to load the presets that come with @HyperspaceMadness’s Mega Bezel, located in shaders\shaders_slang\bezel\Mega_Bezel\Presets\Base_CRT_Presets\Sony-Megatron\STD

They are all SDR, not HDR :grimacing:

I tried some of the shaders in shaders\shaders_slang\hdr instead, the ones that are actually called HDR, and they look way better. Now I can properly check!

Thanks!

5 Likes

Ah bingo! Great stuff, hope this works better for you.

1 Like

Hmm, I’m still not sure everything is correct. Shouldn’t the image change when I change the luminance and contrast settings in the HDR settings? No matter what I pick there, it all looks the same. I’ve also tried switching to D3D11 instead of Vulkan just to see if there’s any difference, but there doesn’t seem to be.

Edit: Never mind, I changed back to Vulkan and now the parameters in the shader change as intended, so I think I’ve got everything working! As stated before, it seems to look great with the following settings on my LG OLED CX.

TV Settings:

  • Make sure HDMI Ultra Deep Colour is on for the HDMI port used.
  • Dynamic Tone Mapping: Off
  • OLED Light: 100
  • Contrast: 100
  • Brightness: 50
  • Sharpness: 10
  • Colour: 55

Windows Settings:

  • HDR enabled in Windows and calibrated using the HDR calibration tool from the Windows Store.

RetroArch Settings:

  • Vulkan as the video driver in RetroArch
  • HDR enabled in RetroArch
  • HDR settings in RetroArch:
      • Peak Luminance: 720
      • Paper White Luminance: 550
      • Contrast: 5
      • Expand Gamut: On

Shader Parameters:

  • #reference “:/shaders/shaders_slang/hdr/crt-sony-megatron-jvc-professional-TM-H1950CG-hdr.slangp”
  • hcrt_colour_space = “2.000000”
  • hcrt_max_nits = “720.000000”
  • hcrt_paper_white_nits = “550.000000”
  • hcrt_expand_gamut = “0.000000”
  • hcrt_lcd_subpixel = “1.000000”

2 Likes

Sorry, the slightly confusing thing (OK, very confusing) is that because this shader is natively HDR, it overrides any settings in the RA->Settings->Video->HDR menu (apart from turning on HDR itself, that is). Instead, use the options in the ‘Shader Parameters’ menu underneath Quick Menu->Shaders.

EDIT: Obviously you’ve found the correct options under shader params. When you change those, are you saying nothing happens? You’ve definitely got HDR selected in the shader parameters too?

1 Like

Just to say, you should probably put your TV brightness up to max. This may not work when using HDR, as HDR is supposed to override local settings, but just in case…

EDIT: You can also use the ‘240p Test Suite’ and its grey ramps to calibrate your HDR settings - this is different for every TV, or at least every model of TV.

1 Like

Here are some GBA shots of a preset I’ll probably release soon:

4 Likes

With LG TVs it’s also important to change the label of the HDMI input that you’re using to match your input device; I have mine set to PC, for example. I also use the HDR (Game) colour setting.

You might want to lower your Peak Luminance value following @Wilch’s advice.

You may also have to adjust your Black Level and Gamma settings for the input. Power saving settings also significantly affect the brightness of the screen.

1 Like

Sorry, the slightly confusing thing (OK, very confusing) is that because this shader is natively HDR, it overrides any settings in the RA->Settings->Video->HDR menu (apart from turning on HDR itself, that is). Instead, use the options in the ‘Shader Parameters’ menu underneath Quick Menu->Shaders.

Ah, that’s why nothing happened when I tried changing the settings there; quite confusing, yes :slight_smile: Things do change when editing the shader parameters though.

Just to say, you should probably put your TV brightness up to max. This may not work when using HDR, as HDR is supposed to override local settings, but just in case…

Brightness is at 100 on the TV, or are you talking about some other brightness?

With LG TVs it’s also important to change the label of the HDMI input that you’re using to match your input device; I have mine set to PC, for example. I also use the HDR (Game) colour setting.

You might want to lower your Peak Luminance value following @Wilch’s advice.

You may also have to adjust your Black Level and Gamma settings for the input. Power saving settings also significantly affect the brightness of the screen.

@Cyber: Yep, I’ve changed the input to PC as well, to be able to use 2160p@120Hz. I think I’m using the same Peak Luminance as @Wilch, 720? Power saving is completely off; like you say, it ruins the brightness.

When you say black level and gamma for the input, you’re talking about TV settings, right? The Black Level options are Low/High/Auto, and I have it on Auto. Gamma is greyed out and can’t be changed, but it looks like it’s at 2.2.

2 Likes

Ah sorry, I thought you said it was at 50% brightness on your TV.