Is plasma still the best modern choice for retro gaming?

So I'd really like to upgrade my main living room display to 4K and get rid of the limitations 1080p imposes: not enough pixels for CRT mask simulation, and having to pick between overscan, underscan, or uneven scanlines in RA.
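To make the scaling problem concrete, here's a quick sketch (assuming standard 240-line retro content; 1080 and 2160 are just the 1080p and 4K panel heights):

```python
# Integer-scale check: how many whole times does a 240-line
# retro frame fit into each display height?
SOURCE_LINES = 240

for height in (1080, 2160):
    scale = height / SOURCE_LINES  # vertical scale factor
    status = "integer (even scanlines)" if scale.is_integer() else "non-integer (uneven scanlines or crop)"
    print(f"{height} lines -> {scale}x: {status}")

# 1080 / 240 = 4.5 -> forced to choose between overscan, underscan,
#                     or uneven scanlines
# 2160 / 240 = 9.0 -> clean integer scale, room for mask simulation
```

That 4.5x factor at 1080p is exactly why the compromises above exist, while 4K divides evenly.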

I wanted to ask here rather than a TV forum because this upgrade is mainly for a better retro gaming experience with RA.

So have 4K LED TVs surpassed plasma technology for this stuff, or should I keep my plasma for its true blacks and its ability to display pixels better?

Wanted some opinions here.

I believe the latency on modern LCDs is better than on old plasmas, and there's less risk of burn-in.

What about true blacks and motion blur?

Some of the more expensive tech, like QLED and OLED, does better blacks than conventional LED, and features like local dimming can improve blacks even on regular LED sets.

Motion blur is still a thing, but it's much better than it used to be. Response times have improved, and many sets include backlight strobing to combat motion blur even further, at the cost of brightness.

If you want true black, go with something above LED; the local dimming on LED (at least on my LG set) is really, really bad. I would have gone for OLED, but I've heard certain models can have burn-in.

I don’t know anything about QLED.

Plasma is excellent for emulation. Don't jump from plasma to LED/LCD; that's a step backward. If you must upgrade, then go all out for OLED. You'll only suffer from temporary image retention (not burn-in) if you do careless things with your TV. Coming from plasma, which can also suffer from image retention, you should already know how to handle that. LED/LCDs are a bit less finicky and easier to handle, but it's not worth the quality drop compared to plasma and OLED. Take it from someone who has used all three for gaming and emulation and who currently owns and uses both an LED and an OLED display for gaming, entertainment, and general use. Ask those who talk about OLED burn-in whether they actually own and use one of those sets and see what they tell you. The answer is probably no.


The only reason I was considering it is to make the jump to 4K for CRT simulation in emulation, and to do that you have no choice but to go LED. But I'll hold off, I guess.

Another downside to my plasma is the heat. A 65-inch plasma gives off A LOT of heat. For now, in the winter, I don't mind it, but when I'm cranking my AC to cool my living room and using the plasma for extended periods…

It's 2019 and digital displays are still catching up to the CRT. I've owned all types of displays, and I can say with confidence that no modern digital display has yet surpassed the CRT in terms of objective quality. I bought a 17" CRT PC monitor, nothing fancy, for $10 on Craigslist. Perfect blacks, zero input lag, zero motion blur, complete flexibility with resolutions and refresh rates, and it's just easier on the eyes than any LCD. To get similar performance from a digital display, you would need to spend thousands of dollars on a fancy new OLED display.

The only things digital displays are better at are taking up less space/weighing less and having larger screens. The latter is negligible for a gaming display that you're sitting two feet away from.

Obviously, the CRT blows away any other display when it comes to 240p content, but I also prefer them for modern content as well. I would love one of those monster Hi-scan Wegas for 1080p content, if only I had the space. One day I’ll have a game room filled with different CRT displays.

CRT is love, CRT is life.


How is your CRT connected? Do you use an HDMI-to-VGA converter? Modern graphics cards don't output an analog signal…


That is exactly what I use on my setup, lol. If I can figure out how to adapt my retro consoles to VGA, I won't even need any other monitors. All the info I can find, though, is about adapting to SCART :(

For most monitors, a 15 kHz signal is unsupported, so you'll need a line doubler in the mix. The best/easiest thing is probably to get a RetroTink 2X and put an HDMI-to-VGA converter after it. That will give you 31 kHz with optional scanlines, just like you get from RetroArch.
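The scan-rate math behind the line doubler, as a quick sketch (rough NTSC numbers; the exact figures vary slightly between consoles):

```python
# NTSC draws ~262.5 lines per frame at ~60 Hz, which gives the classic
# ~15.7 kHz horizontal scan rate of retro consoles. A line doubler
# repeats each line, doubling the rate into PC-monitor territory.
lines_per_frame = 262.5
refresh_hz = 60.0

h_freq = lines_per_frame * refresh_hz  # ≈ 15,750 Hz (the "15 kHz" signal)
doubled = h_freq * 2                   # ≈ 31,500 Hz (the "31 kHz" a VGA monitor expects)

print(f"native: {h_freq / 1000:.2f} kHz, line-doubled: {doubled / 1000:.2f} kHz")
```

That doubled ~31.5 kHz rate is why a line-doubled signal works on ordinary VGA monitors that reject the raw console output.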


I get it, but it's 2019 and we SHOULD have something that surpasses technology that's decades old. Not to mention technology that's oversized, dangerous to lift, and dangerous to service.

I had to pass on a 36-inch CRT a few miles from me because, for one, it wouldn't fit in my car, and two, I had no one available to help me carry it because it was over 200 lbs. I mean, that's ridiculous.

They're good in terms of what we use them for, but are they practical? Absolutely not, which is why people just give them away for free. You're doing them a favor by lifting a large, heavy, obsolete piece of technology out of their home.

I use a 17" PC Monitor. It’s easy to move around by myself. Even a 21" monitor would still be easy to move around.

Heck, I used to move around 27" CRT TVs by myself, when I was still into CRT TVs. Nowadays I wouldn’t settle for less than a PC monitor, though. 31kHz ftw.

Yes, we SHOULD have something that surpasses technology that's decades old, but we DON'T. The only things digital displays are better at are weighing less, having larger screens, and having perfect geometry. The CRT is superior in every other aspect of display technology. This is a perfect example of how capitalism frequently fails to maximize utility. We have an economic system that favors cheap crap that can be easily mass-produced and sold to an ignorant public that doesn't know any better.


It's plugged directly into the VGA port of the machine I'm using. I made sure this machine had a VGA port before buying it. Technically, this machine outputs DisplayPort converted to VGA, but I'm still able to force it to display [email protected], so it works out fine. It's not "true" 240p, but with the interlacing shader you'd never be able to tell the difference.


The digital display has become a status symbol, much like the car once was, a symbol of having “made it” and of membership in the middle class. The CRT is no longer “sexy.” This, not objective quality, is probably the primary reason why consumers switched to digital displays.


The exact same thing happens in audio, the physics of which was essentially “solved” back in like the 50s. Unfortunately, hifi gear from that era is hideous, gigantic and expensive to produce, requiring skilled and specialized craftsmanship and sensitive testing and instrumentation (sound familiar?).

Ever since then, the technology advances have been focused on minimizing compromises to the sound while making the equipment smaller, lighter, louder and more attractive, while costing less to produce at a massive scale.

There's a reason audiophiles deal in vinyl, tube amps that output <10 watts, and speaker pairs that were built in the 50s-60s and take up an entire room. If you're willing to forgo all compromises and strive purely for the best at any cost, you can achieve an experience that borders on transcendence.


Sort of on this topic:

I've been thinking lately that the hi-scan Wegas might be severely underrated as retro gaming TVs. Sure, hooking up real consoles will result in a line-doubled image (gross), but it seems like you could hook up an emulation PC via HDMI and then use the interlacing shader for scanlines, and get a perfect scanlined "240p" picture, right? With the ability to support up to 1080i, you'd have a perfect TV for all eras of content, including modern stuff. Am I missing something? Could the hi-scan Wega be the one CRT to rule them all?


Yes, they’re awesome. I’ve never gotten to mess around with one in that context, but I don’t see any reason why it wouldn’t work as you describe.

A friend of mine had a 32" 16:9 Wega for a long time, as it is/was the absolute best display money can buy for Street Fighter 4 and Super Street Fighter 2 Turbo: HD Remix on Xbox 360. It weighed >200 lbs, obviously, but it was incredible.


Ah, I knew there would be a catch: it looks like the Hi-scan Wegas upscale everything to either 1080i or 540p, which adds 1-2 frames of input lag.

So, for 240p stuff, if you want zero input lag you’re faced with the same problems as a digital display: you output at 1920x1080 and either have letterboxing (4x vertical scale), a cropped image (5x vertical scale), or scaling artifacts from non-integer scaling and messed up scanlines. 5x scale would also result in weird-looking scanlines with the interlacing shader.
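Those trade-offs in numbers, as a quick sketch (assuming a 240-line source on a 1080-line output, as above):

```python
# Compare the two integer vertical scales available for 240-line
# content on a 1080-line display.
src_lines = 240
out_lines = 1080

for scale in (4, 5):
    scaled = src_lines * scale
    diff = out_lines - scaled
    if diff >= 0:
        print(f"{scale}x -> {scaled} lines: {diff} lines of letterbox")
    else:
        print(f"{scale}x -> {scaled} lines: {-diff} lines cropped")

# 4x -> 960 lines:  120 lines of letterbox (60 top, 60 bottom)
# 5x -> 1200 lines: 120 lines cropped off the image
```

Either way you're 120 lines off from filling the screen exactly, which is the choice between letterboxing and cropping described above.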

One should be able to eliminate the letterboxing at 4x scale by using the vertical size adjustment in the service menu, but you’d have to resize it when you switched to any standard (non-letterboxed) 1080p/i source. Would this be the best solution?

From what I've seen online, only a few HD CRTs (Samsungs?) will upscale 480p to 540p, which is obviously bad news. The rest just underscan, which isn't great, but it's better than scaling. IIRC, there were also some 100 Hz TVs that add latency as they monkey with the refresh rate.
