Sony Megatron Colour Video Monitor

Part of it is the subpixel structure, but another huge interest of mine is nits while running BFI, tied with pixel response times in general. Once we get larger format QD displays, for my use case I'll be sitting far enough away for the mask details to be less apparent, but I've been on a huge motion clarity kick lately and I'm wondering how it will compare. I currently have an LG C8 OLED and man, the BFI just isn't cutting it for me: too much brightness loss, and the perceived flicker is a little too high. But if a QD monitor can kick up the nits, it might be what I'm looking for.

More than the mask, I really, really want that CRT-esque smooth motion clarity and the sort of light phosphor decay you'd see on most sets. Maybe one day soon!

3 Likes

@MajorPainTheCactus @Nesguy @Cyber @BendBombBoom

I also have an LG C1 and I can take over the task of testing this shader on it and taking photos. Unfortunately, the best camera I have is a OnePlus 7 Pro. It has a macro mode that I hope is enough to capture subpixel elements though. You can check an example of my photos here: New CRT shader from Guest + CRT Guest Advanced updates

I don’t know much about photography, so if you want specific photo settings, please instruct me on what you would like to see.

Now I’m off to set up RetroArch on Windows and check this shader out in HDR for the first time. :slight_smile:

4 Likes

Firstly, thanks for your kind words. I probably wouldn't frame it like this though, as in replacing CRTs with LCDs, at least not for the time being. How I'd frame it is as getting access to CRT screens (that are hopefully as close as we can get) that we otherwise wouldn't have access to.

As in, who has the space, money and time to collect all the different models of PVM? Let alone Sony Trinitron, JVC Professional etc., and a thousand and one arcades. I'm hoping recreating these CRT screens on LCDs opens the screen style up to a much wider audience who otherwise don't have the time, patience and money to own an actual CRT. I for one didn't fully appreciate the full spectrum of CRT screens before I started - each is beautiful in its own respect.

Having said all that, I totally hope we see your laundry list of features soon. My Eve Spectrum has a version of what you're saying in point 1 - Blur Busters implemented it. Brightness is nearly there - certainly some screens should be bright enough. Also, a lot of CRTs didn't actually get that bright - I'm looking at you, TMNT and Simpsons arcades. I think 4K is enough as you're mostly simulating 300 and 600 TVL monitors.

Pretty much we just need emissive display tech with an RGB subpixel structure, as advertised in the QD-OLED promotional material at CES 2022. :man_shrugging:

4 Likes

I think BFI is a very brute-force approach to simulating pulse-driven displays. Hopefully backlight strobing will see wider support, and hopefully it soon gets to a level that approaches CRT motion clarity.

I don't think we're that far off. I don't think we need stupid refresh rates either (I know Blur Busters do), as I think the problem is much simpler than reproducing frames of a game at a ridiculous speed. All we really need is for the display itself to pulse the LCD elements, and unlike a CRT, an LED/LCD can have multiple sets of virtual beams, so the problem becomes much easier. OLEDs should get us there due to their response times but obviously haven't yet, for whatever reasons I'm yet to understand.
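
To make that "multiple virtual beams" idea concrete, here's a rough slang-style fragment sketch. It's purely illustrative (nothing any shader currently does, as far as I know), and the beam count, beam width and hard-coded phase are all made-up numbers:

```glsl
#version 450
// Illustrative rolling-scan pulse: rather than blanking whole frames (BFI),
// only rows far from a set of rolling "virtual beams" are darkened, so each
// row gets a short duty cycle while mean brightness stays reasonable.
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 0) uniform sampler2D Source;

const float kNumBeams  = 3.0;   // assumption: 3 beams in flight per refresh
const float kBeamWidth = 0.15;  // assumption: lit band, as a fraction of one beam period

void main()
{
    vec3 colour = texture(Source, vTexCoord).rgb;

    // In a real pass the phase would advance with FrameCount each refresh;
    // it's hard-coded here to keep the sketch self-contained.
    float phase = 0.0;

    // Distance from this row to the nearest beam centre.
    float pos   = fract(vTexCoord.y * kNumBeams - phase);
    float dist  = min(pos, 1.0 - pos);

    // Full brightness at the beam centre, falling to black by kBeamWidth.
    float pulse = 1.0 - smoothstep(0.0, kBeamWidth, dist);

    FragColor = vec4(colour * pulse, 1.0);
}
```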

3 Likes

Thanks @nfp0! The more the merrier. I'd love to see more photos of this shader taken on different people's setups and configurations, as it'll help me debug and understand things. Please post whatever you have, although it is important to use proper camera settings and post them alongside the images: ISO 100 or ISO 200 (I've been using the latter), white point D65 (i.e. 6504K) and a 1/60 shutter speed (or matching whatever refresh rate you're using).

3 Likes

Hello,

do you have any PC monitor recommendations? I'm looking for a 32-inch flat screen that would be compatible with this shader, and if possible 1440p, so as not to bring my GPU to its knees. Thank you.

2 Likes

Hi, the first question I have is what GPU do you have? This shader is one of the most efficient and will be faster at 4K than most other shaders at 1440p.

Resolution is key for TVL - 4K minimum for 600TVL, 1080p minimum for 300TVL. I’d get a 4K monitor that’ll go as bright as you can afford.
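
As a rough sanity check on those numbers: each simulated phosphor triad needs a handful of physical pixels for its red, green and blue stripes. At 4K that's 3840 / 600 TVL = 6.4 pixels per triad, which is workable; at 1080p you'd only get 1920 / 600 = 3.2, which isn't enough to draw all three phosphors, while 1920 / 300 = 6.4 puts 300 TVL back in reach.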

1 Like

No problem! Sure, I’ll try and use those settings.

I seem to be doing something wrong though. I’ve tried Windows 10 with HDR on with my RX 6800 and I only managed to get it to work on D3D11. D3D12 gives me unusable green garbled graphics and Vulkan doesn’t show the HDR options on RetroArch (I’m on 1.10.1). Windows and graphics drivers are up to date.

D3D11 is the only one that worked but gives me this with the crt-sony-megatron-bang-olufsen.slangp. The colors look even more washed out in real life. And the grey ramp is all out of whack. All other presets are also looking wrong.

2 Likes

Yes that’s a classic sign that HDR isn’t in fact working for you. As a test try going into the shader parameters and switch into SDR mode. See if that brings back all the correct colour. Also the green garbled look would be a telltale sign that some kind of chroma compression is going on. Have you got the right cables for your display and ports? What connection type are you using displayport or hdmi? What spec of each are you using? Vulkan not showing HDR option is another sign something isn’t quite right with your setup as in it cant detect HDR back buffer support.

EDIT: You might also be able to rule out RetroArch by looking at text in Windows and seeing if it's rendering poorly, looks blurry, or has a red or blue fringe.

2 Likes

I’m not sure if Vulkan HDR not working is pointing to an issue with his setup.

It seems there’s an issue with Vulkan and HDR on newer AMD cards. This was also mentioned in the guest shader thread.

I have an RX 5700 and have the same issue as @nfp0: the HDR options don't show after setting the driver to Vulkan.

HDR works fine with D3D11, so that’s what I’ve been using.

3 Likes

Looks even worse.

I can see HDR is getting triggered correctly on the TV, so the signal is getting there. The rest of Windows also looks correct. Also, I’ve played Resident Evil 2 in HDR mode on Windows and it worked correctly.

I'm using an HDMI 2.1 cable. Output on Windows is set to Full RGB, 10-bit @ 120 Hz with VRR enabled.

Everything looks right with the correct colors on Windows. The RetroArch menu also looks right. It’s only when I enable the shader that it goes bonkers.

The only settings I've changed in RetroArch were the driver and exclusive fullscreen.

2 Likes

Yes, I'd still kind of argue that's an issue with @nfp0's setup, in the sense that there's an issue with AMD graphics cards. What we're trying to find out is whether it's a problem specific to the shader or a wider problem with the OS, drivers, monitor, etc.

1 Like

Have you definitely checked that all the colour settings are on 'full colour' in Windows display settings? Hmm, another thought: maybe that display has an irregular subpixel layout? Could you try BGR instead of RGB in the shader parameters? Also, could you take a really close-up shot of the display so I can see what's going on with the subpixels? Use the SMPTE-C colour bars - red, green and blue - and get a shot of the area in between the bars so there is a bit of white plus blue and green, or red and green, if you see what I mean.

EDIT: sorry, just noticed you said it's got full colour on.

Is it alright in windowed mode, then? Also, you need neutral sharpness - that may be 0 or 50, usually depending on your display. Also, just to test, try 60 Hz with VRR disabled.
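
If the menu is fiddly, you could also pin the subpixel layout from a preset override. A hypothetical sketch (the parameter name here is from memory and may differ in your version, so check it under Quick Menu > Shader Parameters first):

```
#reference "crt-sony-megatron-bang-olufsen.slangp"
# hcrt_lcd_subpixel is an assumed name: 0 = RGB, 1 = BGR
hcrt_lcd_subpixel = "1.0"
```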

My advanced display settings look like this:

There is an info button on the C1, I think it's the 3 dots, and it should tell you resolution, refresh rate, and whether it's in HDR. From my experience you need a GPU with an HDMI 2.1 output. I have a laptop with a 2060 and while I can turn on HDR in Windows 10, it looked significantly worse than my 3080 with a 2.1 output.

1 Like

@nfp0, as another test, and further to @BendBombBoom's comment, can you try SDR mode: turn off all HDR both in Windows and RetroArch and turn on SDR in the shader. You may need to play around with the brightness, contrast, saturation and gamma settings to bring up the brightness if it's particularly dark. This should rule out a subpixel issue.
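
A hypothetical preset sketch for that SDR test (again, the parameter names are from memory and may differ; verify them under Shader Parameters):

```
#reference "crt-sony-megatron-bang-olufsen.slangp"
hcrt_hdr = "0.0"          # assumed switch: 0 = SDR output, 1 = HDR
hcrt_brightness = "0.2"   # nudge these up if the image comes out too dark
hcrt_contrast = "0.1"
```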

I'm looking for a 1440p monitor because my PC is mainly used to play recent games that require a lot of resources.

1 Like

Today I went back to the JVC Professional TM-H1950CG simulation, as @DariusG had very kindly donated a better image than the one I had over on this thread (if anybody else has shots, please donate generously!). Thanks again DariusG!

Anyway, ignoring the aspect ratio in the image, I think I've got quite a bit closer.

JVC Professional TM-H1950CG

Sony Megatron Simulation

CRT Photo: No idea - ask DariusG!

LCD Photos: OnePlus 8 Pro Camera: Pro Mode, ISO 200, WB 6500K, Shutter Speed 1/60, Auto Focus, 48 MP JPEG.

5 Likes

Fantastic!

The only things we haven’t seen yet (I think):

- low TVL aperture at 8K (for the 3 people that have an 8K display)

- low dot pitch dotmask (like an 80s computer monitor). I haven't been able to find ANY reference shots of this, though.

2 Likes

Ok, so yes, I haven't implemented 300 TVL on an 8K screen. I do need to do that; I just couldn't bring myself to type out the massive slot mask. :rofl:
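
As an aside, if typing out the table is the blocker, a mask like that could in principle be generated procedurally from the integer pixel coordinate. This is only a sketch of the idea, not how the Megatron actually authors its masks, and the triad width and slot height are assumptions:

```glsl
// Procedural slot mask sketch: triadWidth-pixel RGBX|RGBX triads, with the
// second half-triad's slots offset vertically by half a slot.
vec3 SlotMask(ivec2 pixel, int triadWidth, int slotHeight)
{
    int halfWidth = triadWidth / 2;                  // one RGBX group
    int column    = pixel.x % triadWidth;
    int yOffset   = (column < halfWidth) ? 0 : slotHeight / 2;
    int row       = (pixel.y + yOffset) % slotHeight;

    int sub = column % halfWidth;                    // position within RGBX
    vec3 phosphor = (sub == 0) ? vec3(1.0, 0.0, 0.0) // R stripe
                  : (sub == 1) ? vec3(0.0, 1.0, 0.0) // G stripe
                  : (sub == 2) ? vec3(0.0, 0.0, 1.0) // B stripe
                  : vec3(0.0);                       // X: black gap

    // Blank one row per slot to form the horizontal gaps between slots.
    return (row == slotHeight - 1) ? vec3(0.0) : phosphor;
}
```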

So I do have a low TVL dot mask, but it may not be what you mean? If you select shadow mask and put it down to 300 TVL it'll show it. I'll post some pics - I think last time you saw it you thought it was too low-res.

As always close up photos of the displays would be of help if you have some?

My next major concern, apart from yet another reorganisation of the shader presets to support SDR out of the box, is to focus on the signal side of things: chroma compression, NTSC comb filters and the like.

2 Likes