Sony Megatron Colour Video Monitor

So my Eve Spectrum has backlight strobing, which Blur Busters say reaches plasma levels of motion blur reduction. Not sure where plasma sets sit in the hierarchy of display technologies by blurriness, but it sounds quite good.

I’ve tested it, and it doesn’t feel that far off my CRT, albeit much darker. I’ll take some videos to see if I can show what I’m seeing.

2 Likes

Thanks for the heads-up!

I wish I could use your amazing shader right away, but even though I have an HDR TV, Linux does not yet support HDR, unfortunately. But I might start using RetroArch on Windows for this reason alone! I’ll give it a try later.

2 Likes

It works in SDR: just switch it to SDR in the shader parameters. You do need a bright display, though. My laptop display (Dell XPS) doesn’t support HDR but has a really bright screen, so the shader can work well on non-HDR displays. (I’ve got another monitor that is HDR capable but incredibly dim, so there’s that too.)
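To give a feel for why brightness matters so much, here’s a back-of-the-envelope sketch (both numbers are illustrative assumptions, not measurements): a shader that draws the mask by lighting only a fraction of the panel’s subpixels throws away most of its light.

```python
# Toy estimate: effective peak white after a phosphor-mask shader.
# Both values below are assumptions for illustration only.
panel_nits = 500         # assumed sustained full-screen white of the panel
mask_coverage = 1 / 3    # assumed fraction of subpixels the mask leaves lit

print(panel_nits * mask_coverage)  # ~166.7 nits of effective peak white
```

That’s why a dim panel that looks fine on its own can fall apart once the mask is applied.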

2 Likes

I have a question for @hunterk @nesguy and anybody else who wants to chip in: is there any noise transferred between scanlines? My mental image of how these systems work says there isn’t.

My mental image is this: when a console converts the frame buffer into a signal, it traverses the frame buffer linearly, line by line, converting it into one long analogue stream that it then sends down the cable to the CRT. The CRT reads this signal and adjusts the three beams according to what is sent. Those beams traverse the screen from top to bottom, tracing out the analogue signal again as a single long linear stream, as if all the pixels/phosphors had been lined up next to each other in a single row.

At no point is there any noise that can be transferred from one scanline to the next or previous scanline. The only time this can happen is if two scanlines are wide enough/close enough to overlap on the screen itself (no interference can happen before that point).
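In code terms, my mental model is roughly this (a minimal sketch of the idea, not how any real encoder works):

```python
import numpy as np

def framebuffer_to_signal(framebuffer: np.ndarray) -> np.ndarray:
    """Flatten an (H, W, 3) RGB frame buffer into one long sample stream.

    Row-major order: row N+1 only starts after row N has finished, so
    nothing in the signal itself lets one scanline leak into the next.
    """
    h, w, _ = framebuffer.shape
    return framebuffer.reshape(h * w, 3)

# A 240-line, 320-pixel frame becomes one 76800-sample stream.
fb = np.zeros((240, 320, 3), dtype=np.uint8)
print(framebuffer_to_signal(fb).shape)  # (76800, 3)
```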

Am I right in thinking this?

4 Likes

Yeah, that’s my understanding.

4 Likes

Here are some videos of the Sony Megatron in SDR mode on my Dell XPS laptop:

Using these settings (turn on SDR above):

I think they are plenty bright! :man_shrugging:

3 Likes

I have a plasma and I can attest that it handles motion better than the vast majority of LCDs. Still not as good as a CRT though.

2 Likes

Thank you! I’ll give it a try.

Unfortunately the LG C1 can only reach about 350 cd/m² in SDR mode.

2 Likes

@MajorPainTheCactus

It would be great to see some side-by-sides with a standard 15kHz arcade monitor, if possible. Medium-TVL slotmask is the holy grail for many people.

Acquiring a sample probably wouldn’t be too hard; they’re the most common CRT available.

2 Likes

I’m way ahead of you! :wink: But I only have so much time - boo! I’m back to recreating CRTs now!

1 Like

I’ve been meaning to do a simulation of this display for quite some time after seeing these amazing pictures. This is a Sony PVM 1910 - a 300TVL display from around '89. I love the chunkiness of the phosphors on this.

First pass, CRT first and Sony Megatron CVM second. Sadly my phone isn’t quite as good at taking side-on pictures as the one the original photographer used - sorry about the cropping!

CRT photos: no idea of the camera details; pictures sourced off the internet.

LCD photos: OnePlus 8 Pro camera, Pro mode, ISO 200, WB 6500K, shutter speed 1/60, auto focus, 48MPixel JPEG.

I’ll put the videos up tomorrow.

7 Likes

My QN90A made me crazy enough that I got an LG C1 OLED in the same room to compare.

HDR: C1 - it has more contrast, and the perception is of a brighter image. Explosions and bright highlights sparkle in a pleasing way. Colors look very saturated.

90A - it pumps out more light, but because of its lower contrast and how its local dimming works, colors don’t resolve well in bright areas, so they look muted. Small torches can often look awful.

Motion: C1 - 120Hz with BFI is 95% as good as a CRT on the Blur Busters alien test. 60Hz with RetroArch BFI is also near CRT quality.

90A - at 60Hz with BFI, the Blur Busters alien test is sharp but ghosted and artificial looking. 120Hz doesn’t allow BFI. RetroArch + BFI is 75% as good as a CRT.
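For anyone unfamiliar with BFI (black frame insertion), the idea is simple: at 120Hz, each 60Hz game frame is shown for one refresh and followed by one black refresh, halving persistence (and roughly halving brightness). A minimal sketch:

```python
def present_with_bfi(frames, refreshes_per_frame=2):
    """Yield the 120Hz refresh sequence for a list of 60Hz frames."""
    for frame in frames:
        yield frame               # lit refresh: 1/120s of image
        for _ in range(refreshes_per_frame - 1):
            yield None            # black refresh: cuts sample-and-hold blur

print(list(present_with_bfi(["A", "B", "C"])))
# ['A', None, 'B', None, 'C', None]  (None stands in for an all-black frame)
```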

I like the 90A a lot, but I’m annoyed at how I allowed the max brightness numbers to mislead me. It can produce a brighter gray than the C1, but once you add colors and contrast it loses the edge and looks dull in comparison.

4 Likes

Sooooo…how well does the C1 hold up with this shader that seems to depend on TVs having specific subpixel structures? We’ve heard all the bad things about W-OLED displays with their WRGB subpixel structure, so I’m curious.

2 Likes

Well, first of all, you are very blessed with great displays! Both, I’m sure, are amazing. Hmm, OK, so it kind of makes sense to me but kind of doesn’t. I’m wondering whether there is a bug in RetroArch, in particular in the HDR part.

I’m wondering whether you’re able to try different HDR-capable software (games, video playback, that sort of thing) and see whether the same behaviour shows up.

Also, have you tried different drivers in RetroArch, namely D3D12 and D3D11 vs Vulkan? The DirectX drivers allow me to send HDR metadata to the display, and I’m wondering whether there is a bug in there and your displays are each responding well or badly to those values.

It must be pointed out that SDR won’t have this issue. I’d expect, though, the high brightness to be picked up by your eyes more than the darks, i.e. the human visual system is more attuned to light than dark (I might be wrong about that, but I’m pretty sure that’s why we gamma correct - to make more rapid changes in the darks, where we can perceive them).
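On the gamma point, a quick numerical sketch shows how a standard ~2.2 encode spends the code range (illustrative only):

```python
def encode(linear, gamma=2.2):
    """Map linear light (0-1) to an 8-bit code with a 1/2.2 power curve."""
    return round(255 * linear ** (1.0 / gamma))

print(encode(0.10))        # 90: the darkest 10% of light gets ~35% of the codes
print(255 - encode(0.90))  # 12: the brightest 10% gets only ~5% of them
```

So the encode deliberately devotes most of its precision to the dark end, where small steps are easiest to see.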

2 Likes

Or you can wall-mount it, just hide the cables, and maybe she might actually think it is a picture.

1 Like

It could just be the pupillary light reflex at play, normalizing things a bit.

Although OLEDs don’t have the brightness potential of LED displays, they’re bright enough to make you squint in sufficiently bright scenes, particularly in HDR mode. During normal viewing I never once felt that I needed my OLED TV to be brighter, even though it wasn’t quite as bright as my non-HDR-certified LED TV.

As a matter of fact, my OLED performs better in my living room under bright lighting conditions, probably due to its better anti-glare/anti-reflection coating.

“Performed better” doesn’t necessarily mean that the TV itself was brighter; it’s that I could see things more easily on the screen, and that takes the lights as well as the darks into consideration. My brain seems to like it when I can clearly see the differences between the shades of black or grey on the default Mega Bezel frame standing out against the carbon fibre background, just as much as it likes being able to resolve the brighter highlights and darker spots of a flame or lens flare.

So I can probably understand where @BendBombBoom is coming from.

But then again, it could just be a bug, or maybe the Samsung QN90A wasn’t calibrated well enough out of the box.

This phenomenon, OLEDs versus even the best QD-LEDs, can be seen in many reviews featuring side-by-side comparisons. Look up a few and see how the reviewers reacted and what they concluded. The OLEDs usually had the more impactful image quality.

1 Like

The short answer is: just as badly as every other shader, if CRT accuracy is what you’re after, as the white subpixel and quad layout are a fundamental difference between the two display technologies (see the sketch below).

The slightly longer answer is: if you aren’t looking for accuracy, then it’s a subjective thing - why do I like a piece of artwork and you don’t, and vice versa?
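As a toy illustration of that layout mismatch (treating the layouts as flat strings; real panels and the shader’s mask options are more involved): a 3-subpixel phosphor triad lines up perfectly on an RGB stripe but not on a WRGB quad.

```python
rgb_stripe = ["R", "G", "B"] * 4        # classic RGB-stripe LCD, 12 subpixels
wrgb_quad  = ["W", "R", "G", "B"] * 3   # W-OLED, 4-subpixel repeat, 12 subpixels

triad = ["R", "G", "B"] * 4             # the phosphor pattern a mask shader wants
for name, panel in (("RGB stripe", rgb_stripe), ("WRGB quad", wrgb_quad)):
    hits = sum(want == have for want, have in zip(triad, panel))
    print(f"{name}: {hits}/12 subpixels match the intended phosphor")

# RGB stripe: 12/12 subpixels match the intended phosphor
# WRGB quad: 3/12 subpixels match the intended phosphor
```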

1 Like

I’m not disputing this, just offering a possible explanation for @BendBombBoom’s reaction/observations. I anxiously await objective, empirical evidence of the performance of your Sony Megatron Colour Video Monitor on @BendBombBoom’s C1, even though we already have data from other users’ experiences and enough technical information on the subject that we should all by now know what to expect.

2 Likes

Good idea! :rofl: Only problem is she’ll want to choose the picture.

1 Like

I’d argue it’s not the pupillary reflex, as that is a purely mechanical thing: in dark scenarios your pupil opens to let in more photons, to detect darker objects that bounce fewer photons into your eyes, and vice versa for bright scenarios.

What I think is instead at play is how the visual system in the brain fills in/removes detail. What looks better on which display probably depends, though, on the type of content you’re watching - dark vs light, high frequency vs low frequency, high contrast vs low contrast, that sort of thing - and of course what happens when motion is involved. Viewing distance is probably also pertinent to this particular issue.

2 Likes