List of recommended TVs for emulation

***Skip to the bottom of this post for the list.***

Update 6/6/2019:

Added the LG SM9500 to the list, making it the only non-QLED display to make the cut.

The viewing angle on the SM9500 is superior to the QLED displays because it’s an IPS display, but the blacks, contrast ratio and local dimming are all inferior.

Intro:

Having gone through all the TV and monitor reviews at RTings.com, I’ve come up with the following list of recommended TVs for emulating retro games.

Criteria:

For emulating retro games, you want to match a CRT’s performance as much as possible. To match or exceed a CRT’s performance on a modern display, you need the following:

-next-frame input response (<16ms input lag) at 60fps
-black frame insertion and/or 120+ Hz refresh rates
-very high sustained SDR brightness (at least 500 cd/m2)
-black level equal to or less than a CRT’s (~0.5 cd/m2 is typical)
-contrast ratio close to or greater than a CRT’s (~15,000:1)
-wide color gamut
-good color accuracy

Input lag needs to be as close to zero as possible in order to match the input response of these games on a CRT, which is what they were designed to be played on. Likewise, motion blur needs to be as close to zero as possible in order to match the motion clarity of a CRT. For that, you need 120+Hz and software-based black frame insertion and/or adjustable hardware-based strobing.
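
To put a number on “next frame”: a 60fps source gives the display roughly 16.7ms to get each frame on screen, so anything under that can respond on the very next frame, like a CRT. Here’s a minimal sketch of that arithmetic (Python, purely illustrative; the 21ms example set is hypothetical):

```python
# A minimal sketch of the frame-budget arithmetic behind the "<16ms" cutoff.
def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

def is_next_frame(input_lag_ms: float, fps: float = 60.0) -> bool:
    """True if a display can show your input on the very next frame."""
    return input_lag_ms < frame_budget_ms(fps)

print(round(frame_budget_ms(60.0), 2))  # 16.67 ms per frame at 60fps
print(is_next_frame(13.0))  # True  -- e.g., the C9's post-update lag cited below
print(is_next_frame(21.0))  # False -- a hypothetical 21ms set misses the window
```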

For emulating CRT effects such as scanlines and RGB phosphors, you need A LOT of brightness, more than the vast majority of TVs and monitors are capable of. Scanlines alone cause a 50% reduction in brightness. Black frame insertion/strobing at 120Hz for 60fps content is another 50% reduction in brightness; BFI @ 240Hz for 60fps is a 75% reduction. If you add the RGB mask on top of that, that’s another ~50% reduction. Here are the different possible configurations and the resulting reductions in brightness:

-scanlines + BFI @ 120Hz: 75% reduction
-scanlines + BFI @ 240Hz: 87.5% reduction
-scanlines + RGB mask + BFI @ 120Hz: 87.5% reduction
-scanlines + RGB mask + BFI @ 240Hz: 93.75% reduction

I don’t know the precise brightness reduction of the RGB mask, so I estimated it at about 50%; for scanlines I’m assuming they’re 1:1 (equal-width visible and black lines), which is a 50% reduction in brightness.

The brightest CRTs peaked at around 180 cd/m2, but 125 cd/m2 is a good target if you’re trying to match an average CRT. To match a CRT’s brightness after applying just scanlines and BFI @ 120Hz, you need at least 500 cd/m2 sustained SDR brightness, and even that isn’t as bright as the brightest CRTs.
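
Since the reductions are multiplicative, all the numbers above can be sanity-checked with a few lines of code. This is a minimal sketch (Python) using the post’s own 50%/25% transmission estimates; the effect labels are just illustrative names:

```python
# Sketch of the stacking math above: each effect multiplies light output
# by a transmission factor, so the losses compound multiplicatively.
EFFECTS = {
    "scanlines": 0.50,   # 1:1 scanlines pass ~50% of the light
    "rgb_mask":  0.50,   # rough ~50% estimate, per the post above
    "bfi_120hz": 0.50,   # one black frame per visible frame at 60fps
    "bfi_240hz": 0.25,   # three black frames per visible frame at 60fps
}

def required_panel_nits(target_nits: float, *effects: str) -> float:
    """Panel brightness needed to still hit `target_nits` after the effects."""
    factor = 1.0
    for effect in effects:
        factor *= EFFECTS[effect]
    return target_nits / factor

# Matching an average CRT (~125 cd/m2) with scanlines + BFI @ 120Hz:
print(required_panel_nits(125, "scanlines", "bfi_120hz"))              # 500.0
# The worst case above (scanlines + RGB mask + BFI @ 240Hz, 93.75% loss):
print(required_panel_nits(125, "scanlines", "rgb_mask", "bfi_240hz"))  # 2000.0
```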

Very few current displays are capable of reaching the brightness levels required. In fact, there is not a single gaming monitor currently available for less than ~$2,000 that has the required sustained brightness for SDR content. HDR doesn’t help us for non-HDR content and applications, such as emulators, but HDR-capable displays tend to have greater SDR brightness as well. The only TVs that meet the necessary requirements are QLED HDR TVs. There is not a single non-QLED display that is capable of meeting all the necessary performance requirements when it comes to input lag, sustained SDR brightness and refresh rates.

Other important qualities to consider are black level, contrast ratio, and color reproduction. The QLED displays are capable of matching or beating a CRT when it comes to black level and contrast ratio. Additionally, all of the listed displays are wide color gamut displays, so they should be able to display the more deeply saturated reds and greens that NTSC allowed for compared to sRGB, and they should be more than capable of reproducing the limited color palettes of 8 bit and 16 bit games.

Another interesting finding is that none of the LG OLED displays make the cut: all of the 2018 and 2019 models have input lag >16ms and/or use aggressive automatic brightness limiters (ABL) that prevent them from sustaining the peak brightness levels required. The burn-in issue is also more severe than what you saw on plasmas.

If one wants to dig a little deeper, it looks very likely that QLED will be the OLED killer. If they can figure out how to replace the LCD’s color filter with quantum dots, that would mean a 3x increase in brightness or a 3x increase in efficiency, which is game over for OLED. This would also allow such displays to match a CRT’s brightness when adding scanlines + RGB mask + BFI @ 240Hz, effectively outperforming a CRT in every category except black level.

OLED might live on in QD-OLED displays, at least until “direct view” QD becomes a reality, which would involve replacing the LCD altogether with QDs that are excited by electricity to directly produce light. This would have all the advantages of OLED with none of the drawbacks (greater brightness, better color, same black level, greater longevity). CRT emulation on such a display would match or exceed the performance of a real CRT. Either way, things are looking increasingly dire for LG, which has no plan B for its OLED displays.

Without further ado, here are the best TVs for emulating retro games while using CRT effects. It’s a very tough choice between these TVs, with excellent options available in every price range.

Recommended TVs for emulation (Updated 6/6/2019):

 LG SM9500
 Samsung Q90/Q90R QLED		
 Samsung Q9FN/Q9/Q9F QLED
 Samsung Q8FN/Q8/Q8F QLED
 Samsung Q7FN/Q7/Q7F QLED
 Samsung Q6FN/Q6/Q6F QLED
 Vizio P Series Quantum	

For more on how QLED works: https://www.cnet.com/news/how-quantum-dots-could-challenge-oled-for-best-tv-picture/

For more on QLED and the future of OLED: https://www.zdnet.com/article/fear-and-trembling-lg-display-faces-the-axe-for-oled-tv-burn-in-and-market-squeeze/

I was just reading that RTings updated their review of the C9 with input lag of around 13ms at 60Hz and 7ms at 120Hz after a firmware update.

It does have very good input lag numbers after the firmware update, but it still suffers from low SDR max brightness (136 cd/m2 sustained SDR). Additionally, like all OLEDs, it also has pretty severe burn-in potential (even worse than early plasma TVs).

Before you exaggerate the possibility of burn-in on OLED and under-emphasize the importance of true black in getting a CRT-like experience, do you actually own one of said televisions? I broke down those RTings tests, and let me tell you, they are not very close to real-world usage in my opinion. I’ve been using an LG OLED TV since early 2017 and I play a lot of games and movies, and I also use it for emulation. I use mine as a computer monitor, as it’s the centerpiece of my Home Theater PC, and I can tell you that I have no image retention issues whatsoever. Since I know I have an OLED TV, I have my taskbar set to auto-hide and my desktop background runs a slideshow. I don’t deliberately leave the TV on with static content for any extended period of time if I can avoid it, but when I’m doing something, I’m not worried about image retention at all, because it just doesn’t happen with the way I use my TV.

No, I don’t own any of these TVs and am basing my conclusions on the tests performed by RTings. The QLED displays have very good black levels and black uniformity, although not as good as OLED, obviously. It’s also worth pointing out that CRTs didn’t reach true black.

OLED also has the edge when it comes to viewing angles, since there’s no loss of quality when viewing OLED at an angle. Eventually, QD direct view displays will completely solve the viewing angle issue and allow for true black.

Have you run any tests yourself, or do you have another source that contradicts the findings from the RTings tests? In particular, we’re concerned with:

-peak brightness

-input lag

-burn-in potential

Retro games have a lot of static imagery in HUDs, life bars, timers, item and stage information, etc. Furthermore, if you’re using any kind of scanline filter/shader/overlay, you’re basically burning in the scanline pattern by aging the visible pixels faster. You’re risking permanent damage to the display if you use it this way. There have been reports of burn-in after as little as 4,000 hours, much lower than the minimum of 30,000 hours reported by LG.

Even if burn-in isn’t an issue, you still have inadequate sustained SDR brightness for blur reduction methods and CRT effects. The only way you can overcome the brightness reduction is by increasing SDR brightness to very high levels, and OLED’s aggressive ABL prevents that (in order to avoid burn-in).

In addition to low sustained SDR brightness, you have input lag >16ms on many of the LG OLEDs (according to the tests at RTings).

I’m aware that retro games have a lot of static imagery, and even modern games do. I use the Analog Shader Pack for my scanlines/CRT emulation. What I’m telling you is that you are exaggerating the effects based on your interpretation of what you’ve researched. The screens don’t just burn in like that except under extreme circumstances. There are things a user can overlook that can cause temporary image retention, but that is reversible. I’m saying that even playing retro games, using scanline filters AND overlays, I am not experiencing any burn-in or even temporary image retention. I did experience temporary image retention within the first few months of owning my TV while playing Civilization V for hours on end, basically non-stop besides work and sleep for days, but that image retention quickly went away, and as I type on the same screen years later I see no signs of unevenness or anything of the sort. I play games where I spend lots of time in menus, etc., for example Nioh, and no retention. If you own an LG OLED TV and you notice image retention, you can run the “Clear Panel Noise” feature, and it does what it says on the box. I haven’t had to run that in years, and the TV does an automatic panel clean on its own every once in a while when you turn it off.

If you analyze the RTings test, they said that two of the TVs that played CNN had the static bar clearly burnt in, while another that played FIFA 18 had the logo slightly burnt in. Another, which played NBC, had no permanent image retention, and another that played a sports channel also didn’t suffer the problem. Do you watch TV like this: 5 hours on, 1 off, 5 on, 1 off, 5 on, 1 off, 5 on, 1 off? That’s 20 hours a day of CNN only (at max brightness for one of the TVs) for 167 days straight! What about sleep? What about work? I doubt you can look at an OLED TV at maximum brightness for too long without getting a headache. I would think that a normal user wouldn’t notice image retention and then do nothing about it right away, or worse yet, keep doing the same thing even after said image retention is noticed. When a normal person runs Clear Panel Noise and the image retention goes away, everything is basically reset to zero and you have a uniform panel once again. You don’t see image retention taking place and not try to do anything to clear it. That’s another big flaw in that testing. Then look at the screen they use to detect the image retention: a large solid background. Under normal mixed use it’s quite possible for the TV to develop image retention and then, due to varied content, for the retained image to go away without the user intervening or even ever noticing. I will have to look into the other criteria for the perfect CRT-emulating set before I can comment on them.

You said CRTs didn’t give true black; a shader/overlay can actually simulate the true colour of a CRT as well. At the end of the day this is all subjective, and hands down the best CRT emulation I have experienced is the Analog Shader Pack on my OLED TV. I really don’t need anything more than this. It is a very large improvement over the lower contrast and not-really-black blacks of my other LCD TV. Plasma is also great for that retro CRT look and feel, in my opinion.

Perhaps image retention isn’t an issue with the way you’re using the TV, but it’s still an issue for some, and it appears to be worse than the problem on plasma TVs (for the record, my main TV is a plasma), so it’s something worth bearing in mind.

Even if the fear of burn-in is completely exaggerated and a non-issue, the LG OLEDs wouldn’t make the cut for input lag and/or insufficient sustained SDR brightness to compensate for blur reduction + CRT effects.

Good write-up, and the stuff you listed is extensive and very in-depth. I just think it’s crazy to have a checklist like this when you can get the real thing for free just by looking on FB Marketplace.

Again, this is talking about emulation only.

Can’t beat free! I actually own several CRTs, but all of them need a good cleaning and a recap. Pretty much any CRT you find these days is probably going to need a little bit of work.

The number of considerations that go into trying to match a CRT’s performance in a modern display is definitely somewhat mind-boggling. I did this partially because I wanted to know if future display tech would be suitable for retro gaming and one day match or exceed a CRT’s performance. We’re probably 95% of the way there already with the displays I listed, and direct-view quantum dot displays could eventually get us to 100%, completely eliminating the viewing angle issue and allowing for true black (although it should be noted that QLED is already capable of matching a CRT’s black level, which wasn’t true black).

OLED TVs have virtually instantaneous pixel response time and much less motion blur compared to LED/LCD TVs, so someone using an OLED TV may not even need to apply any further brightness-diminishing blur reduction methods. There’s a video I recently saw where they put one through the Blur Busters UFO test and it easily won over the LED/LCD they were comparing it to. If I find it again, I’ll post it.

They have other advantages as well, like viewing angles. OLED is MUCH better when it comes to that. With all other LCD tech you lose quality as you move off center. That’s one of the first things I noticed after I got my OLED TV. I am going to post the video version of an update to the same RTings tests that seem to fuel your fears about burn-in. If you listen carefully to the way the presenter qualifies his statements and the testing methodology, you will see that there is really nothing to worry about if you just use your TV and don’t put it through a deliberate stress test to see its limits. So my lack of burn-in is not a result of how I use my TV; I use it normally: whenever I want to do something on the computer, play a game, watch a movie or cable, I use it. It’s more about how I don’t use my TV, meaning I don’t run RTings-style tests or anything that I know will likely damage the TV. If I ever notice image retention, I will run the built-in Clear Panel Noise mitigation, which actually does what it says on the box. The RTings presenter even says that their testing is not representative of real-world usage and basically admits that they were trying to see how far the TVs could go before they broke. So interpretation is a hell of a thing. That link you posted about an OLED showing burn-in at a US retailer does not represent real-world, normal home use. OLED TVs should not be used as virtual picture frames where the picture remains unchanged, nor as sign/menu-board replacements at restaurants, for example. LED/LCD TVs/monitors are excellent for that.

The reason I’m so passionate about this is that this FUD is being circulated en masse in the global media and it is actually threatening the market for OLED TVs. That could never be good for consumers. Time and time again we’ve witnessed superior technologies end up extinct because of a lack of mass-market appeal and superior sales of an inferior technology. LG OLED really is the holy grail compared to what we had to contend with a few years ago.

I really implore you to go out and buy one yourself and try it first. You’ve obviously done your homework, but I’m saying put away the theories, numbers and books and let your own eyes, ears, fingers, reflexes and brain be the judge. If you can’t buy one, go to a friend or family member who has one, set up your system and give it a run for a while, then come back and talk to me. After that, feel free to give more qualified advice.

OLED TV is here now! We don’t have to wait for anything to come out. Don’t discourage people from experiencing the ultimate right now.

A fast pixel response time (even 1ms) is NOT sufficient to eliminate sample-and-hold motion blur and achieve CRT-like motion clarity. Even with an instantaneous response time, there is still A LOT of blur from sample and hold. To eliminate that, you need software-based BFI and/or hardware-based strobing, AND sufficient brightness to make up for the brightness reduction that is incurred.

You still have not responded to the fact that the LG OLEDs have input lag >16ms and/or low sustained SDR brightness.

Yes, I acknowledged this in a previous post. However, it’s also worth pointing out that they’ve made great strides in addressing the viewing angle problem and it is much improved on newer LCD TVs and the QLEDs that I listed.

Please understand, I’m not on a personal crusade against OLED, or anything. The LG OLEDs simply don’t have the specs in those performance categories that I’ve identified as being important for CRT emulation.

Now, you say you don’t have any image retention problems, and that’s great. Good for you. However, if you’re regularly having extended gaming sessions playing retro games with scanline filters and HUDs, this is something that you need to be aware of with OLED. There is NO risk of burn-in with QLED. Yes, scanline filters will basically burn the scanline pattern into the TV after prolonged use, because you’re only aging the visible pixels. A potential solution is to use a pattern that draws the visible line first, and switch it with a pattern that draws the black line first, so that you’re swapping which pixels are getting aged (see the sketch below). If you did that and used one filter 50% of the time and the other filter 50% of the time, that would solve the pixel aging problem with scanline filters. Again, though, this is a problem that QLED simply doesn’t have at all, so it’s worth mentioning.
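
Here’s a rough sketch of that alternating-pattern idea in Python (numpy), purely illustrative: two complementary masks, with the phase flipped between sessions so both sets of rows accumulate similar wear. The per-session flip and the hard 0/1 mask are my own simplifying assumptions:

```python
# Sketch of the even-wear idea above: keep two complementary scanline masks
# (visible-line-first vs. black-line-first) and alternate between them so
# every pixel row spends roughly half its total on-time lit.
import numpy as np

def scanline_mask(height: int, phase: int) -> np.ndarray:
    """1.0 on visible rows, 0.0 on dark rows; phase (0 or 1) picks which set."""
    rows = (np.arange(height) + phase) % 2
    return rows.astype(np.float32).reshape(height, 1)

def apply_scanlines(frame: np.ndarray, phase: int) -> np.ndarray:
    """Multiply an (H, W) or (H, W, 3) frame by the scanline mask."""
    mask = scanline_mask(frame.shape[0], phase)
    if frame.ndim == 3:
        mask = mask[..., None]
    return frame * mask

# Flip the phase each session (here: randomly) so wear averages out over time.
phase = np.random.randint(2)
frame = np.ones((480, 640, 3), dtype=np.float32)  # stand-in for emulator output
out = apply_scanlines(frame, phase)
print(out[:4, 0, 0])  # alternating 1.0 / 0.0 rows, offset by the chosen phase
```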

Also, if you’re trying to get reliable information about burn-in, do you really think that the manufacturer is going to be the most reliable source? Don’t you think that they might have a vested interest that makes them biased? LG claimed a 30,000-hour minimum before burn-in, when in reality we’ve seen burn-in on OLEDs after as little as 4,000 hours (see the ZDNet article I linked to). If you’re looking for reliable information about a product, check your sources for potential bias and take whatever the manufacturer provides with a grain of salt.

Tell me about it; I own a Plasma TV…

The writing is on the wall for LG OLED; see the ZDNet article I posted in the first post. Samsung simply outplayed them; it has little to do with consumer perception of the technology.

Sorry, but this just won’t fly. You’re saying that personal anecdote is superior to scientific testing and that my conclusions based on scientific testing are invalidated until I’ve personally witnessed the thing first-hand (or would somehow be more valid if I saw the thing first hand). Give me a break… You’re basically arguing against science, here.

I have said no less than 3 times now that even if we completely discount the burn-in issue, you still have input lag and low sustained SDR brightness that prevents the LG OLEDs from making the list. You still haven’t responded to that. Come back to me with some scientific tests that show that an OLED has >500 nits sustained SDR brightness and <16ms input lag, and I’ll add it to the list. It’s that simple. Until then, the OLED TVs simply can’t be recommended for CRT emulation. Of course, someone may have a different set of priorities, for which an OLED display would be fine (in the QLED vs OLED article by RTings that you linked to, they didn’t even consider input lag). It seems to me that you feel like you have to defend/rationalize your decision to purchase an OLED, which really isn’t necessary.

Again, I have nothing against OLED. Like many, I thought it would be the thing that finally delivered CRT-like performance in a modern display, but it just isn’t. As of right now, the only displays capable of that are QLED displays, with the exception of the sole LG LED-lit LCD (not OLED) that made the cut.

I think this is a valuable conversation, but let’s try to keep things scientific and let’s try to not get overly defensive about our conclusions. I’m more than happy to add OLED to the list if I see some scientific testing that warrants it.

I really doubt that’s going to happen, because it would have shown up already and it hasn’t. I’ve used the same RTings YouTube video to debunk your exaggeration of the burn-in issue, along with the official response to the issue by the manufacturer. One aspect of science is being able to make observations. I’m constantly observing and I’ve reported based on my observations. Don’t attempt to dismiss my personal experience based on my usage pattern, which I have also described. Even the RTings video says that there was no image retention observed due to letterboxing black bars in movies, and they suggested that it wouldn’t occur if you watch a lot of 4:3 content. Scanlines would fall into the same category.

You’re getting a bit personal here for someone who’s scientific. I don’t have to rationalize/defend anything, actually. I’m just enjoying something and would like to share with an audience that they can enjoy it as well. I’m extremely satisfied with my purchase. You have recommended certain QLED TVs and an LG LED TV as the best for CRT emulation despite their shortcomings when it comes to black level, viewing angles and motion blur. In my opinion that can’t be a good recommendation. Your view of the burn-in thing seems quite skewed, and you’re spreading this view on a public forum, thus potentially influencing readers. I saw this and thought I’d put things into perspective from an actual user’s viewpoint. It would be nice if we could get more views from more actual users of the tech.

With some things, seeing is believing, because you can misinterpret someone else’s results. By getting your own set, or running your own scientific CRT emulation tests on someone else’s QLED and OLED sets, I am sure you would be able to make a more informed recommendation. I will maintain what I’ve said before:

You’ve added QLED and LED TVs, which aren’t perfect when it comes to CRT emulation, because they all fall short in some way or other. OLED TVs also fall short according to your scientific criteria. In order to arrive at a truly scientific conclusion as to which is best, you might need to do some double-blind controlled testing with users comparing OLED TVs and QLED TVs to an actual CRT and picking which one matches it the closest. Science is not just about coming up with a hypothesis and making a shortlist based on that. You also have to test and compare. I have experience using CRTs, LED TVs, plasma TVs and OLED TVs when it comes to CRT emulation, and in my experience, based on my testing, OLED is better than LED for that. I also prefer plasma to LED when it comes to CRT emulation. Maybe we are comparing apples to oranges here, because maybe my idea of CRT emulation using the Analog Shader Pack does not qualify as “true” CRT emulation based on your criteria. As I mentioned before, I am completely satisfied with the “CRT emulation” experience, no matter how inaccurate it may be, using the Analog Shader Pack on my OLED TV.

To me it looks really nice, man. Try to see it in person at least once. If you’re interested, I don’t mind posting my settings. Even if it may not be 1:1 accurate with a CRT, it does what I would want out of a CRT shader, i.e. filter out the jarring jaggies and present my retro games as beautifully as I remember them back in the day. If I get the chance I’ll try to post a couple of screenshots, but you’ll only be able to judge them properly if you display them on a display that can show true blacks, not almost-black or very dark grey.

How have I over-exaggerated the issue? By simply stating that it is, in fact, an issue to be aware of?

If you consistently display a scanline pattern for long periods of time, it’s a good idea to at least switch between two scanline patterns (black line first, visible line first) so that the pixels age at roughly the same rate.

This really does look like you’re being defensive and not rational. Where have I attempted to dismiss your personal experience based on the usage pattern that you’ve described? In fact, I’ve done the exact opposite of “dismissing” your personal experience; I’ve acknowledged it multiple times. It seems like you’re dismissing the scientific evidence which contradicts your personal experience. I’ve explained why taking the manufacturer’s word for it on this issue isn’t a great idea.

I’m merely pointing out the possibility that you’re being unnecessarily and irrationally defensive… I don’t see how this attitude is helpful.

The black level on the QLEDs is excellent, as is black uniformity. It just doesn’t have TRUE black like OLED, but then, neither did CRTs.

The viewing angle issue is something that may be negligible for some, and it’s something that is much improved on recent LED-lit LCDs and the QLEDs.

As far as motion blur is concerned, OLED is actually inferior, because it doesn’t have the sustained SDR brightness to compensate for motion-blur-reducing technologies, which carry a heavy brightness penalty. Using software-based black frame insertion and/or hardware-based strobing for 60fps @ 120Hz on any of the TVs I listed will result in less motion blur than an OLED not using those technologies. The OLED TVs can still use software-based BFI @ 120Hz, but they don’t have the SDR brightness to stay bright enough while doing so with emulators.

Again, how? I’m simply pointing out that it’s an issue to be aware of, and something that can potentially occur, something that is nearly universally acknowledged.

This isn’t one of those things, though. How can I misinterpret input latency numbers and peak SDR brightness numbers? This is about as straightforwardly objective as it gets.

That’s not going to be any more reliable than scientific tests done by someone else or by a group of people. We’re talking about objective things which can be measured, here. The ability to reproduce results is what makes scientific evidence valid. So, if there are some scientific tests out there that contradict the tests done by RTings, then there might be some reason to question those findings. Simply not having performed the tests yourself is NOT sufficient reason to question those findings. That’s not how science works.

The only thing remaining, afaik, is to completely eliminate the viewing angle problem, which of course isn’t an issue at all if you’re facing the display directly. I don’t know about you, but most of the time when I’m gaming, I’m directly in front of the display… Others may give greater priority to this issue.

Black level is inferior to OLED, but OLED also has superior blacks to CRT.

I’ve already discussed motion blur.

How is that any more objective (scientific) than identifying the objective performance capabilities of CRTs and then determining if a TV is capable of matching or exceeding those objective performance capabilities?

That’s probably a big part of it, and I’m more than happy to discuss the criteria I’ve provided. This is about CRT emulation without resorting to any gimmicks such as washing out the scanlines/mask, making the mask/scanlines transparent, clipping contrast or significantly altering the original colors output by the emulator, all of which is done to compensate for the huge loss of brightness that is inherent to these effects.

With the low sustained SDR brightness of OLED, it is simply incapable of compensating for the brightness penalty resulting from BFI, scanlines, and RGB phosphor effects, taken together. It doesn’t even have sufficient SDR brightness to compensate for BFI by itself, using no CRT effects.

With no BFI and/or strobing, you’re not getting anywhere close to CRT-like motion clarity.

With scanline/phosphor effects, you’re either significantly washing them out/making them transparent, such that they no longer look/act like the real thing on a CRT, or else the brightness on OLED isn’t coming anywhere close to that of a CRT.

What is needed for “true” CRT emulation is to replicate the CRT at the micro scale, which involves emulating the RGB phosphor mask and getting the individual emulated phosphors to have the same RGB and luminance values as what you could get on a CRT. That’s going to require even more brightness than the currently brightest QLEDs are capable of; once they figure out how to replace the color filter with quantum dots, then we’ll have “true” CRT emulation.

For now, the smallest scale we can “accurately” emulate is the scale of the scanline. At 1080p and higher resolutions we can match the beam width variation seen at normal viewing distances on a CRT, and with great enough brightness we can match the luminance of the visible lines to what you’d get on a CRT. The final essential ingredient is motion blur reduction through software BFI and/or hardware strobing, which by itself results in an exact halving of brightness with 60fps @ 120Hz. Even if you’re satisfied with less than perfect emulation of a CRT’s scanlines, the OLED doesn’t have the brightness to compensate for BFI @ 120Hz. This alone disqualifies it.
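
As an illustration of what matching the beam width variation could look like in practice, here’s a hedged sketch (Python/numpy) that widens each emulated scanline’s Gaussian profile with the source line’s luminance. The 3x vertical scale and the sigma range are invented parameters for demonstration, not values from any particular shader:

```python
# Sketch of scanline emulation with beam-width variation: brighter source
# lines get a wider Gaussian "beam", as on a CRT.
import numpy as np

def draw_scanlines(src: np.ndarray, scale: int = 3,
                   sigma_min: float = 0.3, sigma_max: float = 0.7) -> np.ndarray:
    """Expand each source row into `scale` output rows with a Gaussian
    vertical profile whose width grows with the row's luminance."""
    h, w = src.shape
    out = np.zeros((h * scale, w), dtype=np.float32)
    # Vertical offsets of each output row from the beam center, in source lines.
    offsets = (np.arange(scale) - (scale - 1) / 2.0) / scale
    for y in range(h):
        line = src[y]
        # Per-pixel beam width: dim pixels -> narrow beam, bright -> wide.
        sigma = sigma_min + (sigma_max - sigma_min) * line
        for i, dy in enumerate(offsets):
            profile = np.exp(-0.5 * (dy / np.maximum(sigma, 1e-6)) ** 2)
            out[y * scale + i] = line * profile
    return out

# 4-row brightness ramp: each output "scanline" is a 3-pixel-tall beam
# whose vertical spread tracks the input brightness.
src = np.linspace(0.1, 1.0, 4).reshape(4, 1) * np.ones((4, 8), dtype=np.float32)
print(draw_scanlines(src).shape)  # (12, 8)
```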

Then, of course, there’s the input lag of OLED displays, most of which have greater than 16ms input lag. The input lag of these TVs may be fine for modern gaming and it may be a non-issue for many people when playing retro games via emulators, but CRTs had zero input lag and that’s the benchmark. This means anything incapable of next-frame response when using emulators (more than 16ms input lag @ 60fps) is automatically disqualified. The only OLED displays that have sufficiently low input lag are the LG C9 OLED and the LG E8 OLED, and neither of those have sufficient sustained SDR brightness for BFI @ 120Hz + CRT effects.

You’ve recommended TVs of today, not of the future, none of which are capable of performing this perfect emulation of CRTs that you’re speaking of. You’ve excluded OLED TVs from your list. What I’m saying is that you cannot say which compromises should be acceptable or more important than others when all of them are trying to get as close as possible to something else today. How could you be the one to decide whether general colour reproduction, contrast, black levels or motion clarity is more important than the others? None of the TVs can do it all at the same time. Therefore it’s up to the individual user to decide what’s more important to them, until we are at the point where we have either perfect CRT emulation or a particular type of TV technology that betters the others in all of the said criteria.

You emphasize BFI and other motion-blur-reducing techniques. Did you know that the type of motion blur experienced on LED-lit LCD TVs is not the same as the type experienced on OLED TVs? What this means is that some might find OLED’s level and appearance of blur an acceptable tradeoff and not want to resort to the brightness-sapping BFI and such. Remember, no matter which tech you choose, you have to trade something, at least right now.

Here are a few pertinent excerpts from the above review:

“Like all OLED TVs, it delivers outstanding dark room performance, thanks to the perfect inky blacks and perfect black uniformity. It has an outstanding response time, delivering clear motion with no blur trail, but this does cause stutter when watching movies.”

“Perfect deep blacks” / “Extremely low motion blur, and excellent low input lag” / “The image remains accurate when viewed at an angle”

“Update 5/17/2019: We’ve retested the input lag on the same firmware (03.50.31) and found the 4k @ 60Hz + HDR input lag is in the same ballpark as the other resolutions (around 13ms). We don’t know why our previous measurements were higher, as we did confirm them twice. We’ve also updated the 1440p @ 60Hz input lag with this latest firmware. Update 5/2/2019: We’ve retested the input lag of the C9 with the firmware update 03.50.31. The input lag measurements in SDR game and PC modes have decreased. We haven’t retested 1440p @ 60Hz but we will retest this in the future.”

With these TVs, the native contrast ratio is further reduced by attempts to improve viewing angles. It’s as if they are becoming more IPS-like, and IPS has relatively poor native contrast ratios. Technologies like local dimming add artifacts which are not present at all on an OLED TV.

While being very useful, a scientific review can miss things, especially when we don’t know how to properly weigh all of our criteria and the reader is not able to fully comprehend all of the data presented. It’s sometimes easier and faster to judge things in person, especially if you know what you’re looking for. If you could just see for yourself, then you might better understand what I’m trying to explain to you and the wider audience. The foundation of proper blacks has a positive effect on all the colours that the display produces. Try to find a way to see one, then come back to the forum with your findings. This is imagery and the visual arts we’re talking about. Your eyes must be able to play a part in coming to a conclusion. At least some part of this must be subjective. The relatively poor native contrast ratio and the lack of true blacks are distracting in my opinion. The OLED input lag is more than good enough. The worst types of motion artifacts (ghosting, coronas/inverse ghosting, PWM artifacts) are not present on OLED displays, and thus “black frame insertion and/or 120+ Hz refresh rates” and “very high sustained SDR brightness (at least 500 cd/m2)” might not be as important when using an OLED, while being essential when using an LED TV.

You keep saying that CRTs don’t do true black; however, CRT black levels are still far superior to LCD black levels. To me that’s one of the most glaring omissions in your criteria. Add to that the superior native contrast ratio of CRT versus LCD. Then you have the response time. In those three areas OLED trumps LED, and it even betters CRT in two out of the three! Something that’s superior in ability should find it easier to emulate something that’s inferior, shouldn’t it? The other way around is different: something that’s inferior in ability cannot be expected to equally emulate something that’s superior.

https://forums.anandtech.com/threads/contrast-ratio-whats-the-typical-contrast-ratio-for-a-crt.26445/

http://www.displaymate.com/crtvslcd.html

I never said that personal anecdote is superior to scientific testing. I was merely trying to show you that my personal observation can also be taken into account and is not unscientific at all, because observation and reporting is one of the foundations of scientific testing. I never invalidated anything; I merely implored you to gather even more information of a different type and source, so as to enhance your wealth of information, because we are obviously seeing things differently here with regard to the weighting of certain criteria in your “Recommended TVs for emulation,” which you also said were “the best TVs for emulating retro games while using CRT effects.” For no OLED TV to be on that list, you would have to have never seen and tested one in person. It really is that good! Have you seen or tested one using CRT emulation in person, though?

None of the scientific tests you have shown involve direct testing of CRT emulation on said TVs. Did RTings or anyone else test emulation while using CRT effects on the different types of TVs? Did you? I have, and I’ve been comparing and testing which looks, plays and feels better and closer to what I remember from my days of using CRTs, for a very long time now.

Nothing the manufacturer said about burn-in or temporary image retention seemed to be false in the article that I posted. What they stated seems to match up with RTings’ findings, Trusted Reviews’ findings, and my findings as well.

Even this guy talks about seeing it in person:

I mean, what’s the harm in that? If all you’ve theorized, and researched so far is accurate, then what’s wrong with trying to see it in person? Aren’t you just a little bit curious to see with your own eyes?

Okay, but this is moving the goalposts, and in a way that doesn’t make a lot of sense, given that I readily acknowledged that no current display is capable of “perfect” CRT emulation. This was never about “perfect,” but about getting as close as possible to a CRT, using criteria that are objective and scientifically measurable.

All the displays listed are wide color gamut displays that are more than capable of reproducing the colors used by 8 bit and 16 bit games.

Regarding black level and uniformity, OLED is already exceeding a CRT, and QLED is sufficient to match a CRT. This of course results in “infinite” contrast for OLED, even though the peak brightness is lower than many current (non-quantum) LED-lit LCDs.

So really the main advantage of OLED when it comes to CRT emulation is the ability to be viewed at any angle. This has to be weighed against motion clarity, input response time, and overall max brightness.

(emphasis added); it’s also noticeable when playing games that scroll or pan (i.e., pretty much all games).

The 2018 and 2019 LG OLEDs use a form of hardware-based strobing that results in stutter. The stuttering can be quite noticeable and distracting with fast-scrolling 2d games; I’ve experienced this first hand. Most hardware-based strobing methods result in some degree of stutter, afaik. In any case, CRTs didn’t have this kind of stutter, so if that’s the benchmark, we want blur reduction that is stutter-free.

You are in denial if you think that OLED is free of sample and hold motion blur or if you think that it has comparable motion clarity to a CRT without using black frame insertion. This level of blur may be acceptable to some, but this isn’t about what some people find acceptable; this is about trying to match the performance of a CRT as closely as possible on a modern display.

In any case, the maximum brightness remains an issue, with a sustained peak SDR brightness of around 300 cd/m2. The way the ABL works on OLED is per-refresh (someone correct me if I’m wrong), so if it’s limited to 300 cd/m2 per refresh and it’s blacking out every other refresh, that’s 150 cd/m2 with blur reduction, which is insufficient for scanlines. Frankly, my 2013 LED-lit LCD is capable of meeting these benchmarks.

How are LCD black levels relevant to QLED, which is a different technology?

QLED black levels can already match or exceed CRT black levels (~0.01 cd/m2 for the best CRTs). OLED is overkill. In fact, the black levels of the top Samsung QLEDs are very close to OLED (~0.005 cd/m2), so any advantage here is negligible (and overkill) when it comes to CRT emulation.

Again, I’m not talking about LCDs here. You seem to think that QLED is the same technology as LCD. The contrast ratio of QLED is already superior to that of a CRT. OLED’s “infinite contrast” is overkill.

Yes, but did you take any actual measurements? Observation without scientific measurements is worthless when it comes to science, and doesn’t qualify as evidence of anything. This isn’t about what “feels right.”

See my above comment.

There are numerous reports of burn-in after fewer hours than the minimum claimed by LG. I already posted this earlier.

What makes you think that I haven’t seen OLED with my own eyes? I said that I’ve never owned one, not that I’ve never seen one. This just doesn’t seem like a serious discussion. Yes, it looks incredible. It just doesn’t meet several of the benchmarks set by CRTs. QLED doesn’t meet all of these benchmarks either, but it meets more of them than OLED. To summarize, when it comes to CRT emulation, the advantages are:

OLED - no image quality loss when viewed at an angle (negligible for gamers who sit directly in front of the display). Infinite contrast and true black (overkill for CRT emulation).

QLED - greater brightness for blur reduction via BFI, greater brightness for CRT effects, lower input latency (next-frame input response at 60fps)

Of the LG OLEDs, only the LG C9 has input lag low enough for next-frame input response while strobing @ 120Hz with BFI, and it has a peak sustained SDR brightness of around 250 cd/m2 if you calibrate it to disable ABL.

With OLED you have to choose between inferior blur reduction with stuttering or scanlines; you don’t have the brightness for both software BFI and scanlines without making other compromises, and pretty much any LED-lit LCD is on par in this regard (sustained SDR brightness).

In a scientific test, they specifically said “but this causes stutter when watching movies.” Why are you adding conditions which they haven’t reported? They never said it’s also noticeable when playing games that scroll or pan (i.e., pretty much all games). Are you seeing what you’re writing? Where is the scientific evidence of this new inability to play games? Why does it score a 9.4 in gaming in this test, then? You now seem to be just arguing for the sake of arguing. You don’t seem to be interested in trying to understand what I’m trying to say at all. You’ve made some progress, but only because I have kept on responding and backing up what I’ve said. I’m still seeing many instances where you’re misinterpreting me and quoting me out of context. It seems like you’re glossing over what I’m saying instead of really reading it and trying to understand. You have some valid points, and I’m glad we’ve come to the agreement that no TV does it all and it is ultimately up to the user to decide which criteria weigh more when it comes to achieving as close to CRT image quality as possible using today’s technology.

You reposted the same ZDNet article, which is about the same RTings tests that I have linked to, in which RTings have qualified the tests and effectively said that there’s nothing to worry about when it comes to burn-in for the vast majority of people. Man, did you even watch the video I posted and listen to it properly? Your harping on this burn-in is being done in a very sensational way.

Finally, I’m not confusing QLED with LCD, as QLED uses LCD technology with an LED backlight and a layer of quantum dots. A QLED TV is another type of LCD TV.

It’s something that is inherent to nearly all hardware-based strobing and that I’ve witnessed first hand. For the most part, I understand what blur reduction methods exist and how they work.

The hardware-based blur reduction that it’s using isn’t something that’s going to work differently when playing a game vs watching a movie.

Furthermore, according to the tests at RTings, stutter from frame hold is an area where QLED is superior to OLED.

Nearly everything you say above is a case of projection. Where have I failed to try to understand you? I have attempted to directly respond to every point you’ve made, while you’ve continued to avoid addressing the numerous points I’ve raised. You just haven’t been able to respond to the evidence that’s been presented to you, and you’re getting increasingly belligerent about it, for what reason I’m not sure. Whatever score RTings assigned that TV for gaming is absolutely irrelevant to this discussion, because their reviewers are using a completely different set of criteria.

Please. Either respond with specific examples or drop this nonsense. I’ve quoted you only in order to directly reply to specific points that you’ve made. If you feel I’ve misrepresented you somewhere, by all means show me where I’ve done so, and maybe you can provide clarification. It’s a bit hard to not be annoyed by the insinuation that I’ve done this intentionally.

Stop responding to what you perceive to be the tone of the post or to what you think my intentions are and respond to the actual content/points, and I’ll try to continue to give you the same courtesy. Comments like this aren’t adding anything useful to the discussion and just serve to provoke.

I’ve listed the pros and cons. The individual will need to weigh those pros and cons. Frankly, though, if gaming monitors are any indication, image quality definitely takes a backseat to other performance criteria, such as input lag and blur reduction, when it comes to the preferences of most gamers.

The OLED’s ability to be viewed at an angle without image quality loss is a negligible advantage (what gamer doesn’t sit directly in front of their display?). The QLED is superior in every other performance benchmark related to CRT emulation:

OLED - ability to be viewed at any angle

QLED - next-frame input response @60fps with BFI enabled, adequate brightness for both zero motion blur via software BFI + 120Hz AND scanline effects to be used simultaneously

Contrast ratio, black level, and color reproduction are a wash when it comes to emulating a CRT because both QLED and OLED are already beating a CRT in these categories.

If anyone is harping on the burn-in issue, it’s you, probably in an effort to distract from the other major issues which you have thus far mostly ignored. I’m talking about input latency, peak brightness, and motion clarity, specifically. From the very beginning I said that this is an issue that people need to be aware of. You’re claiming that it’s a non-issue and that people don’t need to consider it at all. You keep citing the RTings tests while conveniently ignoring the fact that all of their OLED TV reviews still list burn-in as a potential concern, i.e., something to be aware of, which is all I’ve ever claimed throughout this entire discussion.

Also, “not a problem for the vast majority of people” is of limited applicability here, since we are talking about a fairly specialized use-case, not using these displays in the way that the vast majority of people are using them.

With completely different performance capabilities as a result of having those quantum dots… You’re either being completely disingenuous when you compare the capabilities of the two technologies or you’re just not well-informed of the differences.

I know exactly how QLED works. I also know exactly how OLED works. QLED uses LCD technology, and that is a fact. I used the term LCD because it falls under the general category of LCD technologies. There’s nothing grammatically or factually wrong, nor anything disingenuous, about stating it that way. Could you be any more condescending? Your list of recommended TVs didn’t only include QLED LCD TVs, did it? So by using the general term LCD TV, which all of the TVs fall under, I captured the whole set of TVs instead of just the QLED subset. In case you were unsure, QLED is more of a marketing term, and to say that it’s using “different technology from LCD” is quite inaccurate. When QLED first came out, Samsung branded it SUHD. Do you remember that? So please allow me to break down some terms for you. Firstly, LCD stands for Liquid Crystal Display. It uses a layer of liquid crystals, driven by thin-film transistors, which twist to allow different amounts of light to pass through. That is the defining characteristic of all LCD displays, including QLED. Don’t let the marketing terms confuse you about the technology. They are all LCD displays, whether the backlight is made up of CCFL (cold cathode fluorescent light) bulbs or LEDs (light-emitting diodes), and whether it’s edge-lit or backlit with full-array local dimming; it’s still an LCD TV. Now, quantum dot LED-lit LCD TVs just add an additional layer of light-sensitive particles which glow when the light hits them, and they further increase the number of colours and improve the quality and richness of the colours that the TV can display. It’s still an LCD TV. I didn’t use the term LCD in order to diminish QLED in any way, my good sir. I just didn’t want to keep using the same term over and over again to refer to the same thing when I didn’t have to in that instance. It’s one of the ways writers avoid being repetitive.

I do hope that you will have an enjoyable evening, sir. I appreciated the discussion. I hope that I have shared some useful information and that you will continue your research and quest to isolate TVs which are truly capable of doing what you set out to accomplish and recommend. This is a good thing you’ve started here. Keep up the good work.

In your earlier comparison of OLED vs. LCD, you failed to distinguish between non-QLED LCD and QLED. Lumping QLED into the same category with regular LCDs when comparing QLED’s performance to OLED shows that you either aren’t aware of the significant performance differences between the two, or you’re dishonestly trying to make it seem like QLED has the same limitations as non QLED displays because they both fall under the general umbrella of LCD technology. Knowing how the technology works and being informed of the performance differences between QLED and non-QLED LCD are different things.

This is a misleading statement. The specs alone speak for themselves. There are significant performance improvements compared to the best LED-lit LCDs. That little addition of the quantum dots is a much bigger improvement than you seem to realize. Just compare test results at RTings.com. The high-end QLEDs are very nearly reaching OLED black levels while having MUCH higher sustained brightness levels and lower input latency to boot. It puts them in a whole different category than ordinary LED-lit LCDs. The distinction is 100% justified.

Okay, sure. Just seemed like a false equivalence fallacy to me and/or equivocation, intentional or not.

Some key points: When it comes to sustained SDR brightness, (which is essential for using BFI and CRT effects in emulators) my 6 year old LED-lit LCD is on par with LG OLED displays.

Literally the only advantage OLED has over QLED when it comes to CRT emulation is the superior viewing angle, which is negligible for gamers who sit directly in front of the display.

Well you guys have been having fun. I too dream of the consumer flat panel that totally replaces the need of a good CRT (let alone a high end one). Sadly, we are still far from it. Neither OLED nor QLED will cut it, but right now I’ll side with Nesguy in that QLED is closer (though still not good enough).

I think it’ll be helpful to separate the discussion into (still) picture performance and motion performance.

Still picture:

  • The OLED will handily beat the QLED simply because the former has a much better picture. So, if you play games with little to no fast motion (some arcade games, adventure games, RPGs, strategy games, etc.) then definitely go with an OLED. If you don’t care for scanlines then absolutely go with the OLED (though this falls outside of the “emulating CRTs” parameter). That said, it is true that OLEDs offer blacks like nothing else, including CRT. I used to think my CRT gave me real blacks and then I put an OLED next to it and it’s not even a comparison.

Motion:

  • The OLED sucks. I own a C8 and tried everything available to be happy with it either via emulation or real hardware with an OSSC. Motion sucks, and due to two separate issues that Nesguy has discussed:
  1. “Persistence blur” AKA sample-and-hold blur (distinct from “motion blur” made famous by LCD) on fast moving objects and scenes. The only solution is BFI which has the brightness downside already discussed. It’s a dealbreaker. There is a way to almost eliminate ABL if your panel is calibrated (or you make it think it’s calibrated), but the BFI still saps too much brightness to be satisfying. Forget about frame interpolation as another solution.
  2. Stutter. A lot of LCDs have this problem, some even more than LG OLEDs, some less (a few, not at all) due to varying frame hold times (not to be confused with sample-and-hold) for 60fps content. See: https://www.rtings.com/tv/tests/motion/stutter This affects slow scrolling/panning elements and scenes. Next time you watch a movie look at the credits scrolling on that beautiful inky black and you’ll get it immediately. Sounds like the QLEDs are not that much better than the OLEDs in this regard. Again, people tell you to use frame interpolation to work around this, but that’s no good at all especially for CRT emulation.

So it turns out motion is a big deal in gaming, both retro and modern. If you have the setup to do so, try a 720p or 1080p game on a compatible CRT and after a few minutes you’ll want to both smile and cry. You’ll want a QLED for games with fast motion, but it won’t be like a CRT at all. Even the ideal BFI setting will only get you halfway to the motion clarity of a CRT. It’s pathetic.

Lesson: let’s keep dreaming of the future while we keep putting up with our goddamn CRTs. Otherwise, if you have to have something else right now, get either a QLED or an OLED depending on the types of games you play most. Most likely you’ll want the QLED.
