List of recommended TVs for emulation

A fast pixel response time (even 1 ms) is NOT sufficient to eliminate sample-and-hold motion blur and achieve CRT-like motion clarity. Even with an instantaneous response time, there is still A LOT of blur from sample-and-hold. To eliminate that, you need software-based BFI and/or hardware-based strobing, AND sufficient brightness to make up for the brightness reduction that is incurred.

You still have not responded to the fact that the LG OLEDs have input lag >16ms and/or low sustained SDR brightness.

Yes, I acknowledged this in a previous post. However, it’s also worth pointing out that they’ve made great strides in addressing the viewing angle problem and it is much improved on newer LCD TVs and the QLEDs that I listed.

Please understand, I’m not on a personal crusade against OLED, or anything. The LG OLEDs simply don’t have the specs in those performance categories that I’ve identified as being important for CRT emulation.

Now, you say you don’t have any image retention problems, and that’s great. Good for you. However, if you’re regularly having extended gaming sessions playing retro games with scanline filters and HUDs, this is something that you need to be aware of with OLED. There is NO risk of burn-in with QLED. Scanline filters will basically burn the scanline pattern into the TV after prolonged use because you’re only aging the visible pixels. A potential solution is to alternate between a pattern that draws the visible line first and one that draws the black line first, swapping which pixels get aged. If you used each pattern 50% of the time, that would solve the pixel-aging problem with scanline filters. Again, though, this is a problem that QLED simply doesn’t have at all, so it’s worth mentioning.
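To make the pattern-swapping idea concrete, here’s a minimal Python sketch (purely illustrative; an actual filter would do this on the GPU in a shader): two complementary scanline masks, with the phase flipped between sessions so both sets of pixel rows accumulate wear.

```python
import datetime

import numpy as np

def scanline_mask(height, width, phase):
    """Rows where (row + phase) is even pass light; odd rows are black.
    phase=0 and phase=1 produce complementary patterns."""
    lit_rows = (np.arange(height) + phase) % 2 == 0
    return np.repeat(lit_rows[:, None].astype(np.float32), width, axis=1)

# Flip the phase daily so both sets of pixel rows age at roughly the same rate.
phase = datetime.date.today().toordinal() % 2
mask = scanline_mask(1080, 1920, phase)

frame = np.ones((1080, 1920), dtype=np.float32)  # stand-in for emulator output
displayed = frame * mask                         # half the rows are lit today
```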

Also, if you’re trying to get reliable information about burn-in, do you really think that the manufacturer is going to be the most reliable source? Don’t you think they might have a vested interest that makes them biased? LG claimed 30,000 hours minimum before burn-in, when in reality we’ve seen burn-in on OLEDs after as little as 4,000 hours (see the ZDNet article I linked to). It’s a good idea to check your sources for potential bias, and to take whatever information the manufacturer provides with a grain of salt.

Tell me about it; I own a Plasma TV…

The writing is on the wall for LG OLED; see the ZDNet article I posted in the first post. Samsung simply outplayed them; it has little to do with consumer perception of the technology.

Sorry, but this just won’t fly. You’re saying that personal anecdote is superior to scientific testing and that my conclusions based on scientific testing are invalidated until I’ve personally witnessed the thing first-hand (or would somehow be more valid if I saw the thing first hand). Give me a break… You’re basically arguing against science, here.

I have said no less than 3 times now that even if we completely discount the burn-in issue, you still have input lag and low sustained SDR brightness that prevents the LG OLEDs from making the list. You still haven’t responded to that. Come back to me with some scientific tests that show that an OLED has >500 nits sustained SDR brightness and <16ms input lag, and I’ll add it to the list. It’s that simple. Until then, the OLED TVs simply can’t be recommended for CRT emulation. Of course, someone may have a different set of priorities, for which an OLED display would be fine (in the QLED vs OLED article by RTings that you linked to, they didn’t even consider input lag). It seems to me that you feel like you have to defend/rationalize your decision to purchase an OLED, which really isn’t necessary.

Again, I have nothing against OLED. Like many, I thought it would be the thing that finally delivered CRT-like performance in a modern display, but it just isn’t. As of right now, the only displays capable of that are QLED displays, with the exception of the sole LG LED-lit LCD (not OLED) that made the cut.

I think this is a valuable conversation, but let’s try to keep things scientific and let’s try to not get overly defensive about our conclusions. I’m more than happy to add OLED to the list if I see some scientific testing that warrants it.

I really doubt that’s going to happen because it would have shown up already, and it hasn’t. I’ve used the same RTings YouTube video to debunk your over-exaggeration of the burn-in issue, along with the manufacturer’s official response to the issue. One aspect of science is being able to make observations. I’m constantly observing and I’ve reported based on my observations. Don’t attempt to dismiss my personal experience based on my usage pattern, which I have also described. Even the RTings video says that there was no image retention observed due to letterboxing black bars in movies, and they suggested that it wouldn’t occur if you watch a lot of 4:3 content. Scanlines would fall into the same category.

You’re getting a bit personal here for someone who’s scientific. I don’t have to rationalize/defend anything, actually. I’m just enjoying something and would like to share with an audience that they can enjoy it as well. I’m extremely satisfied with my purchase. You have recommended certain QLED TVs and an LG LED TV as the best for CRT emulation despite their shortcomings when it comes to black level, viewing angles, and motion blur. In my opinion that can’t be a good recommendation. Your view of the burn-in thing seems quite skewed, and you’re spreading this view on a public forum, thus potentially influencing readers. I saw this and I just thought I’d put things into perspective from an actual user’s viewpoint. It would be nice if we could get more views from more actual users of the tech.

With some things, seeing is believing, because you can misinterpret someone else’s results. By getting your own set, or by running your own scientific CRT emulation tests on someone else’s sets, both QLED and OLED, I am sure you would be able to make a more informed recommendation. I will maintain what I’ve said before:

You’ve added QLED and LED TVs which aren’t perfect when it comes to CRT emulation because they all fall short in some way or the other. OLED TVs also fall short according to your scientific criteria. In order to arrive at a truly scientific conclusion as to which is best, you might need to do some double-blind controlled testing with users comparing OLED TVs and QLED TVs to an actual CRT and picking which one matches it the closest. Science is not just about coming up with a hypothesis and making a shortlist based on that. You also have to test and compare. I have experience using CRTs, LED TVs, Plasma TVs and OLED TVs when it comes to CRT emulation, and in my experience, based on my testing, OLED is better than LED in that regard. I also prefer Plasma to LED for CRT emulation. Maybe we are comparing apples to oranges here, because maybe my idea of CRT emulation using the Analog Shader Pack does not qualify as “true” CRT emulation based on your criteria. As I mentioned before, I am completely satisfied with the “CRT emulation” experience, no matter how inaccurate it may be, using the Analog Shader Pack on my OLED TV.

To me it looks really nice, man. Try to see it in person at least once. If you’re interested, I don’t mind posting my settings. Even if it may not be 1:1 accurate with a CRT, it does what I would want out of a CRT shader, i.e., filter out the jarring jaggies and present my retro games as beautifully as I remembered them back in the day. If I get the chance I’ll try to post a couple of screenshots, but you’ll only be able to judge them properly if you display them on a display that can show true blacks, not almost-black or very dark grey.

How have I over-exaggerated the issue? By simply stating that it is, in fact, an issue to be aware of?

If you consistently display a scanline pattern for long periods of time, it’s a good idea to at least switch between two scanline patterns (black line first, visible line first) so that the pixels age at roughly the same rate.

This really does look like you’re being defensive and not rational. Where have I attempted to dismiss your personal experience based on the usage pattern that you’ve described? In fact, I’ve done the exact opposite of “dismissing” your personal experience; I’ve acknowledged it multiple times. It seems like you’re dismissing the scientific evidence which contradicts your personal experience. I’ve explained why taking the manufacturer’s word for it on this issue isn’t a great idea.

I’m merely pointing out the possibility that you’re being unnecessarily and irrationally defensive… I don’t see how this attitude is helpful.

The black level on the QLEDs is excellent, as is black uniformity. It just doesn’t have TRUE black like OLED, but then, neither did CRTs.

The viewing angle issue is something that may be negligible for some, and it’s something that is much improved on recent LED-lit LCDs and the QLEDs.

As far as motion blur is concerned, OLED is actually inferior because it doesn’t have the sustained SDR brightness to compensate for motion blur reducing technologies, which incur a heavy brightness penalty. Using software-based black frame insertion and/or hardware-based strobing for 60fps @ 120Hz on any of the TVs I listed will result in less motion blur than OLED without those technologies. The OLED TVs can still use software-based BFI @ 120Hz, but they don’t have the SDR brightness to remain bright enough while doing so with emulators.
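The idea behind software BFI for 60fps content on a 120Hz display, as a minimal sketch (hypothetical frame objects; real implementations live in the emulator’s video pipeline or the display driver):

```python
def present_with_bfi(game_frames, black_frame):
    """Yield one image refresh followed by one black refresh per game
    frame: at 120Hz this shows each 60fps frame for half its normal
    persistence, cutting sample-and-hold blur (and brightness) in half."""
    for frame in game_frames:
        yield frame        # refresh 1 of 2 @ 120Hz: the game frame
        yield black_frame  # refresh 2 of 2 @ 120Hz: black
```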

Again, how? I’m simply pointing out that it’s an issue to be aware of, and something that can potentially occur, something that is nearly universally acknowledged.

This isn’t one of those things, though. How can I misinterpret input latency numbers and peak SDR brightness numbers? This is about as straightforwardly objective as it gets.

That’s not going to be any more reliable than scientific tests done by someone else or by a group of people. We’re talking about objective things which can be measured, here. The ability to reproduce results is what makes scientific evidence valid. So, if there are some scientific tests out there that contradict the tests done by RTings, then there might be some reason to question those findings. Simply not having performed the tests yourself is NOT sufficient reason to question those findings. That’s not how science works.

The only thing remaining, afaik, is to completely eliminate the viewing angle problem, which of course isn’t an issue at all if you’re facing the display directly. I don’t know about you, but most of the time when I’m gaming, I’m directly in front of the display… Others may give greater priority to this issue.

Black level is inferior to OLED, but OLED also has superior blacks to CRT.

I’ve already discussed motion blur.

How is that any more objective (scientific) than identifying the objective performance capabilities of CRTs and then determining if a TV is capable of matching or exceeding those objective performance capabilities?

That’s probably a big part of it, and I’m more than happy to discuss the criteria I’ve provided. This is about CRT emulation without resorting to any gimmicks such as washing out the scanlines/mask, making the mask/scanlines transparent, clipping contrast or significantly altering the original colors output by the emulator, all of which is done to compensate for the huge loss of brightness that is inherent to these effects.

With the low sustained SDR brightness of OLED, it is simply incapable of compensating for the brightness penalty resulting from BFI, scanlines, and RGB phosphor effects, taken together. It doesn’t even have sufficient SDR brightness to compensate for BFI by itself, using no CRT effects.

With no BFI and/or strobing, you’re not getting anywhere close to CRT-like motion clarity.

With scanline/phosphor effects, you’re either significantly washing them out/making them transparent, such that they no longer look/act like the real thing on a CRT, or else the brightness on OLED isn’t coming anywhere close to that of a CRT.

What is needed for “true” CRT emulation is to replicate the CRT at the micro scale, which involves emulating the RGB phosphor mask and getting the individual emulated phosphors to have the same RGB and luminance values as what you could get on a CRT. That’s going to require even more brightness than the currently brightest QLEDs are capable of; once they figure out how to replace the light filter, then we’ll have “true” CRT emulation.
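As a rough illustration of what mask emulation at the micro scale means (illustrative numpy sketch, not any particular shader): every output pixel passes only one of R, G, or B in repeating order, which is roughly why a 100%-strength mask costs about two-thirds of the brightness.

```python
import numpy as np

def apply_rgb_mask(frame):
    """frame: (H, W, 3) float image in [0, 1]. Each pixel column passes
    only one channel (R, G, B, repeating), mimicking a phosphor mask at
    100% strength; average luminance drops to roughly one third."""
    h, w, _ = frame.shape
    mask = np.zeros_like(frame)
    cols = np.arange(w) % 3
    for channel in range(3):
        mask[:, cols == channel, channel] = 1.0
    return frame * mask

white = np.ones((240, 320, 3), dtype=np.float32)
masked = apply_rgb_mask(white)
print(masked.mean())  # ~0.333: the brightness headroom the mask eats
```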

For now, the smallest scale we can “accurately” emulate is the scale of the scanline. At 1080p and higher resolutions we can match the beam width variation seen at normal viewing distances on a CRT, and with great enough brightness we can match the luminance of the visible lines to what you’d get on a CRT. The final essential ingredient is motion blur reduction through software BFI and/or hardware strobing, which by itself results in an exact halving of brightness with 60fps @ 120Hz. Even if you’re satisfied with less than perfect emulation of a CRT’s scanlines, the OLED doesn’t have the brightness to compensate for BFI @ 120Hz. This alone disqualifies it.

Then, of course, there’s the input lag of OLED displays, most of which have greater than 16ms input lag. The input lag of these TVs may be fine for modern gaming and it may be a non-issue for many people when playing retro games via emulators, but CRTs had zero input lag and that’s the benchmark. This means anything incapable of next-frame response when using emulators (more than 16ms input lag @ 60fps) is automatically disqualified. The only OLED displays that have sufficiently low input lag are the LG C9 OLED and the LG E8 OLED, and neither of those have sufficient sustained SDR brightness for BFI @ 120Hz + CRT effects.

You’ve recommended TVs of today, not of the future, and none of them are capable of performing this perfect emulation of CRTs that you’re speaking of. You’ve excluded OLED TVs from your list. What I’m saying is that you cannot say which compromises should be acceptable or more important than others when all of them are trying to come as close as possible to something else today. How could you be the one to decide whether general colour reproduction, contrast, black levels or motion clarity is more important than the others? None of the TVs can do it all at the same time. Therefore it’s up to the individual user to decide what’s more important to them, until we reach the point where we have either perfect CRT emulation or a particular type of TV technology that betters the others in all of the said criteria.

You emphasize BFI and other motion blur reducing techniques. Did you know that the type of motion blur experienced on LED-lit LCD TVs is not the same as the type of motion blur experienced on OLED TVs? What this means is that some might find this level and appearance of blur to be a more than acceptable tradeoff and not want to resort to the brightness-sapping BFI and such. Remember, no matter which tech you choose, you have to trade something, at least right now.

Here are a few pertinent excerpts from the above review:

“Like all OLED TVs, it delivers outstanding dark room performance, thanks to the perfect inky blacks and perfect black uniformity. It has an outstanding response time, delivering clear motion with no blur trail, but this does cause stutter when watching movies.”

  • “Perfect deep blacks”
  • “Extremely low motion blur, and excellent low input lag”
  • “The image remains accurate when viewed at an angle”

“Update 5/17/2019: We’ve retested the input lag on the same firmware (03.50.31) and found the 4k @ 60Hz + HDR input lag is in the same ballpark as the other resolutions (around 13ms). We don’t know why our previous measurements were higher, as we did confirm them twice. We’ve also updated the 1440p @ 60Hz input lag with this latest firmware. Update 5/2/2019: We’ve retested the input lag of the C9 with the firmware update 03.50.31. The input lag measurements in SDR game and PC modes have decreased. We haven’t retested 1440p @ 60Hz but we will retest this in the future.”

With these TVs the native contrast ratio is further reduced by their attempts to improve viewing angles. It’s as if they are becoming more IPS-like, and IPS has relatively poor native contrast ratios. Technologies like local dimming add artifacts which are not present at all on an OLED TV.

While being very useful, a scientific review can miss things, especially when we don’t know how to properly weigh all of our criteria and the reader is not able to fully comprehend all of the data presented. It’s sometimes easier and faster to judge things in person, especially if you know what you’re looking for. If you could just see for yourself, then you might better understand what I’m trying to explain to you and the wider audience. The foundation of proper blacks has a positive effect on all colours that the display produces. Try to find a way to see one, then come back to the forum with your findings. This is imagery and visual arts we’re talking about. Your eyes must be able to play a part in coming to a conclusion; at least some part of this must be subjective. The relatively poor native contrast ratio and the lack of true blacks is distracting, in my opinion. The OLED input lag is more than good enough. The worst types of motion artifacts (ghosting, coronas/inverse ghosting, PWM artifacts) are not present on OLED displays, and thus “black frame insertion and/or 120+ Hz refresh rates” and “very high sustained SDR brightness (at least 500 cd/m2)” might not be as important when using an OLED, while being essential when using an LED TV.

You keep saying that CRTs don’t do true black; however, CRT black levels are still far superior to LCD black levels. To me that’s one of the most glaring omissions in your criteria. Add to that the superior native contrast ratio of CRT over LCD. Then you have the response time. In those 3 areas OLED trumps LED, and even betters CRT in 2 out of the 3! It should be easier for something that’s superior in ability to emulate something that’s inferior, shouldn’t it? The other way around is different: something that’s inferior in ability cannot be expected to equally emulate something that’s superior.

https://forums.anandtech.com/threads/contrast-ratio-whats-the-typical-contrast-ratio-for-a-crt.26445/

http://www.displaymate.com/crtvslcd.html

I never said that personal anecdote is superior to scientific testing. I was merely trying to show you that my personal observation can also be taken into account and is not unscientific at all, because observation and reporting is one of the foundations of scientific testing. I never invalidated anything; I merely implored you to gather more information of a different type and source, because we are obviously seeing things differently here with regard to the weighting of certain criteria in your “Recommended TVs for emulation”, which you also said were “the best TVs for emulating retro games while using CRT effects”. For no OLED TV to be on that list, you would have to have never seen and tested one in person. It really is that good! Have you seen or tested one using CRT emulation in person, though?

None of the scientific tests you have shown include direct testing of CRT emulation on said TVs. Did RTings or anyone else test the different types of TVs while using CRT emulation? Did you? I have, and I’ve been comparing and testing which looks, plays and feels better and closer to what I remember from my days of using CRTs for a very long time now.

Nothing the manufacturer said about burn-in or temporary image retention seemed to be false in the article that I posted. What they stated seems to match up with RTings’ findings, Trusted Reviews’ findings, and my own findings as well.

Even this guy talks about seeing it in person:

I mean, what’s the harm in that? If all you’ve theorized and researched so far is accurate, then what’s wrong with trying to see it in person? Aren’t you just a little bit curious to see with your own eyes?

Okay, but this is moving the goalposts, and in a way that doesn’t make a lot of sense, given that I readily acknowledged that no current display is capable of “perfect” CRT emulation. This was never about “perfect,” but about getting as close as possible to a CRT, using criteria that are objective and scientifically measurable.

All the displays listed are wide color gamut displays that are more than capable of reproducing the colors used by 8-bit and 16-bit games.

Regarding black level and uniformity, OLED is already exceeding a CRT, and QLED is sufficient to match a CRT. This of course results in “infinite” contrast for OLED, even though the peak brightness is lower than many current (non-quantum) LED-lit LCDs.

So really the main advantage of OLED when it comes to CRT emulation is the ability to be viewed at any angle. This has to be weighed against motion clarity, input response time, and overall max brightness.

(emphasis added); it’s also noticeable when playing games that scroll or pan (ie, pretty much all games).

The 2018 and 2019 LG OLEDs use a form of hardware-based strobing that results in stutter. The stuttering can be quite noticeable and distracting with fast-scrolling 2d games; I’ve experienced this first hand. Most hardware-based strobing methods result in some degree of stutter, afaik. In any case, CRTs didn’t have this kind of stutter, so if that’s the benchmark, we want blur reduction that is stutter-free.

You are in denial if you think that OLED is free of sample and hold motion blur or if you think that it has comparable motion clarity to a CRT without using black frame insertion. This level of blur may be acceptable to some, but this isn’t about what some people find acceptable; this is about trying to match the performance of a CRT as closely as possible on a modern display.

In any case, the maximum brightness remains an issue, with a sustained peak SDR brightness of around 300 cd/m2. The way the ABL works on OLED is per-refresh (someone correct me if I’m wrong), so if it’s limited to 300 cd/m2 per refresh and blacking out every other refresh, that’s 150 cd/m2 with blur reduction, which is insufficient for scanlines. Frankly, my 2013 LED-lit LCD is capable of meeting these benchmarks.
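The arithmetic behind that 150 cd/m2 figure, as a quick sketch (assuming, per the caveat above, that ABL caps each refresh independently):

```python
def effective_nits(per_refresh_cap, lit_refreshes, total_refreshes):
    """Average luminance when only some refreshes show the image and the
    rest are black, with brightness capped per refresh."""
    return per_refresh_cap * lit_refreshes / total_refreshes

print(effective_nits(300, 1, 2))  # 60fps BFI @ 120Hz -> 150.0 cd/m2
```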

How are LCD black levels relevant to QLED, which is a different technology?

QLED black levels can already match or exceed CRT black levels (~0.01 cd/m2 for the best CRTs). OLED is overkill. In fact, the black levels of the top Samsung QLEDs are very close to OLED (~0.005 cd/m2), so any advantage here is negligible (and overkill) when it comes to CRT emulation.

Again, I’m not talking about LCDs here. You seem to think that QLED is the same technology as LCD. The contrast ratio of QLED is already superior to that of a CRT. OLED’s “infinite contrast” is overkill.

Yes, but did you take any actual measurements? Observation without scientific measurements is worthless when it comes to science, and doesn’t qualify as evidence of anything. This isn’t about what “feels right.”

See my above comment.

There are numerous reports of burn-in after fewer hours than the minimum claimed by LG. I already posted this earlier.

What makes you think that I haven’t seen OLED with my own eyes? I said that I’ve never owned one, not that I’ve never seen one. This just doesn’t seem like a serious discussion. Yes, it looks incredible. It just doesn’t meet several of the benchmarks set by CRTs. QLED doesn’t meet all of these benchmarks, either, but it meets more of them than OLED. To summarize, when it comes to CRT emulation, the advantages are:

OLED - no image quality loss when viewed at an angle (negligible for gamers who sit directly in front of the display). Infinite contrast and true black (overkill for CRT emulation).

QLED - greater brightness for blur reduction via BFI, greater brightness for CRT effects, lower input latency (next-frame input response at 60fps)

Of the LG OLEDs, only the LG C9 has input lag low enough for next-frame input response while strobing @ 120Hz with BFI, and it has a peak sustained SDR brightness of around 250 cd/m2 if you calibrate it to disable ABL.

With OLED you have to choose between inferior blur reduction with stuttering or scanlines; you don’t have the brightness for both software BFI and scanlines without making other compromises, and pretty much any LED-lit LCD is on par in this regard (sustained SDR brightness).

In a scientific test, they specifically said “but this causes stutter when watching movies.” Why are you adding conditions that they haven’t reported? They never said it’s also noticeable when playing games that scroll or pan (ie, pretty much all games). Are you seeing what you’re writing? Where is the scientific evidence of this new inability to play games? Why does it score a 9.4 in gaming in this test, then? You now seem to be just arguing for the sake of arguing. You don’t seem to be interested in trying to understand what I’m trying to say at all. You’ve made some progress, but only because I have kept on responding and backing up what I’ve said. I’m still seeing many instances where you’re misinterpreting me and quoting me out of context. It seems like you’re glossing over what I’m saying instead of really reading it and trying to understand. You have some valid points, and I’m glad we’ve come to the agreement that no TV does it all and that it is ultimately up to the user to decide which criteria weigh more when it comes to achieving as close to CRT image quality as possible using today’s technology.

You reposted the same ZDNet article, which is about the same RTings tests which I have linked to, in which RTings qualified the tests and effectively said that there’s nothing to worry about when it comes to burn-in for the vast majority of people. Man, did you even watch the video I posted and listen to it properly? You’re harping on this burn-in in a very sensational way.

Finally, I’m not confusing QLED with LCD as QLED uses LCD Technology with an LED backlight and a layer of quantum dots. A QLED TV is another type of LCD TV.


It’s something that is inherent to nearly all hardware-based strobing and that I’ve witnessed first hand. For the most part, I understand what blur reduction methods exist and how they work.

The hardware-based blur reduction that it’s using isn’t something that’s going to work differently when playing a game vs watching a movie.

Furthermore, according to the tests at RTings, stutter from frame hold is an area where QLED is superior to OLED.

Nearly everything you say above is a case of projection. Where have I failed to try to understand you? I have attempted to directly respond to every point you’ve made, while you’ve continued to avoid addressing the numerous points I’ve raised. You just haven’t been able to respond to the evidence that’s been presented to you, and you’re getting increasingly belligerent about it, for what reason I’m not sure. Whatever score RTings assigned that TV for gaming is absolutely irrelevant to this discussion, because their reviewers are using a completely different set of criteria.

Please. Either respond with specific examples or drop this nonsense. I’ve quoted you only in order to directly reply to specific points that you’ve made. If you feel I’ve misrepresented you somewhere, by all means show me where I’ve done so, and maybe you can provide clarification. It’s a bit hard to not be annoyed by the insinuation that I’ve done this intentionally.

Stop responding to what you perceive to be the tone of the post or to what you think my intentions are and respond to the actual content/points, and I’ll try to continue to give you the same courtesy. Comments like this aren’t adding anything useful to the discussion and just serve to provoke.

I’ve listed the pros and cons. The individual will need to weigh those pros and cons. Frankly, though, if gaming monitors are any indication, image quality definitely takes a backseat to other performance criteria such as input lag and blur reduction when it comes to the preferences of most gamers.

The OLED’s ability to be viewed at an angle without image quality loss is a negligible advantage (what gamer doesn’t sit directly in front of their display?). The QLED is superior in every other performance benchmark related to CRT emulation:

OLED - ability to be viewed at any angle

QLED - next-frame input response @60fps with BFI enabled, adequate brightness for both zero motion blur via software BFI + 120Hz AND scanline effects to be used simultaneously

Contrast ratio, black level, and color reproduction are a wash when it comes to emulating a CRT because both QLED and OLED are already beating a CRT in these categories.

If anyone is harping on the burn-in issue, it’s you, and probably in an effort to distract from the other major issues which you thus far have mostly ignored. I’m talking about input latency, peak brightness, and motion clarity, specifically. From the very beginning I said that this is an issue that people need to be aware of. You’re claiming that it’s a non-issue and that people don’t need to consider it at all. You keep citing the RTings tests while also conveniently ignoring the fact that all the OLED TV reviews still list burn-in as a potential concern, ie, something to be aware of, which is all I’ve ever claimed throughout this entire discussion.

Also, “not a problem for the vast majority of people” is of limited applicability here, since we are talking about a fairly specialized use-case, not using these displays in the way that the vast majority of people are using them.

With completely different performance capabilities as a result of having those quantum dots… You’re either being completely disingenuous when you compare the capabilities of the two technologies or you’re just not well-informed of the differences.

I know exactly how QLED works. I also know exactly how OLED works. QLED uses LCD technology, and that is a fact. I used the term LCD because it falls under the general category of LCD technologies. There’s nothing grammatically or factually wrong, nor anything disingenuous, about stating it that way. Could you be any more condescending? Your list of recommended TVs didn’t only include QLED TVs, did it? So by using the general term LCD TV, which all of the TVs fall under, I captured the whole set of TVs instead of just the QLED subset.

In case you were unsure, QLED is more of a marketing term, and to say that it’s using “different technology from LCD” is quite inaccurate. When QLED first came out, Samsung branded it SUHD. Do you remember that? So please allow me to break down some terms for you. Firstly, LCD stands for Liquid Crystal Display. It uses a layer of liquid crystals, driven by a thin-film transistor (TFT) layer, which twist to allow different amounts of light to pass through. That is the defining characteristic of all LCD displays, including QLED. Don’t let the marketing terms confuse you about the technology. They are all LCD displays, whether the backlight is made up of CCFL (Cold Cathode Fluorescent Light) bulbs or LEDs (Light Emitting Diodes), and whether it’s edge-lit or backlit with Full Array Local Dimming; it’s still an LCD TV. Quantum Dot LED-lit LCD TVs just add an additional layer of light-sensitive phosphors (the quantum dots) which glow when the light hits them, further increasing the number of colours and improving the quality and richness of the colours that the TV can display. It’s still an LCD TV. I didn’t use the term LCD in order to try to diminish QLED in any way, my good sir. I just didn’t want to keep using the same term over and over again to refer to the same thing when I didn’t have to in that instance. It’s one of the ways writers avoid being repetitive.

I do hope that you will have an enjoyable evening, sir. I appreciated the discussion. I hope that I have shared some useful information and that you will continue in your research and quest to isolate TVs which are truly capable of doing what you set out to accomplish and recommend. This is a good thing you’ve started here. Keep up the good work.

In your earlier comparison of OLED vs. LCD, you failed to distinguish between non-QLED LCD and QLED. Lumping QLED into the same category as regular LCDs when comparing QLED’s performance to OLED shows that you either aren’t aware of the significant performance differences between the two, or you’re dishonestly trying to make it seem like QLED has the same limitations as non-QLED displays because they both fall under the general umbrella of LCD technology. Knowing how the technology works and being informed of the performance differences between QLED and non-QLED LCD are different things.

This is a misleading statement. The specs alone speak for themselves. There are significant performance improvements compared to the best LED-lit LCDs. That little addition of the quantum dots is a much bigger improvement than you seem to realize. Just compare test results at RTings.com. The high-end QLEDs are very nearly reaching OLED black levels while having MUCH higher sustained brightness levels and lower input latency to boot. It puts them in a whole different category than ordinary LED-lit LCDs. The distinction is 100% justified.

Okay, sure. Just seemed like a false equivalence fallacy to me and/or equivocation, intentional or not.

Some key points: When it comes to sustained SDR brightness, (which is essential for using BFI and CRT effects in emulators) my 6 year old LED-lit LCD is on par with LG OLED displays.

Literally the only advantage OLED has over QLED when it comes to CRT emulation is the superior viewing angle, which is negligible for gamers who sit directly in front of the display.

Well you guys have been having fun. I too dream of the consumer flat panel that totally replaces the need of a good CRT (let alone a high end one). Sadly, we are still far from it. Neither OLED nor QLED will cut it, but right now I’ll side with Nesguy in that QLED is closer (though still not good enough).

I think it’ll be helpful to separate the discussion into (still) picture performance and motion performance.

Still picture:

  • The OLED will handily beat the QLED simply because the former has a much better picture. So, if you play games with little to no fast motion (some arcade games, adventure games, RPGs, strategy games, etc.) then definitely go with an OLED. If you don’t care for scanlines then absolutely go with the OLED (though this falls outside of the “emulating CRTs” parameter). That said, it is true that OLEDs offer blacks like nothing else, including CRT. I used to think my CRT gave me real blacks and then I put an OLED next to it and it’s not even a comparison.

Motion:

  • The OLED sucks. I own a C8 and tried everything available to be happy with it either via emulation or real hardware with an OSSC. Motion sucks, and due to two separate issues that Nesguy has discussed:
  1. “Persistence blur” AKA sample-and-hold blur (distinct from “motion blur” made famous by LCD) on fast moving objects and scenes. The only solution is BFI which has the brightness downside already discussed. It’s a dealbreaker. There is a way to almost eliminate ABL if your panel is calibrated (or you make it think it’s calibrated), but the BFI still saps too much brightness to be satisfying. Forget about frame interpolation as another solution.
  2. Stutter. A lot of LCDs have this problem, some even more than LG OLEDs, some less (a few, not at all) due to varying frame hold times (not to be confused with sample-and-hold) for 60fps content. See: https://www.rtings.com/tv/tests/motion/stutter This affects slow scrolling/panning elements and scenes. Next time you watch a movie look at the credits scrolling on that beautiful inky black and you’ll get it immediately. Sounds like the QLEDs are not that much better than the OLEDs in this regard. Again, people tell you to use frame interpolation to work around this, but that’s no good at all especially for CRT emulation.

So it turns out motion is a big deal in gaming, both retro and modern. If you have the setup to do so, try a 720p or 1080p game on a compatible CRT and after a few minutes you’ll want to both smile and cry. You’ll want a QLED for games with fast motion, but it won’t be like a CRT at all. Even the ideal BFI setting will only get you halfway to the motion clarity of a CRT. It’s pathetic.

Lesson: let’s keep dreaming of the future while we keep putting up with our goddamn CRTs. Otherwise, if you have to have something else right now, get either a QLED or an OLED depending on the types of games you play most. Most likely you’ll want the QLED.


Always! It’s a 24/7 party around here. :smiley:

There are QLED TVs that are matching the color accuracy of OLED, and with excellent black levels that surpass a CRT’s. I think emissive displays do tend to look better in general, though, for reasons that are difficult to pinpoint. OLED clearly has the upper hand when it comes to viewing angle; there’s no loss of image quality at all when viewed at an angle.

From what I’ve seen, the frame-hold time at 60fps is much lower on QLED vs. OLED, so the stutter shouldn’t be as bad. I’ve also noticed that hardware-based strobing methods tend to make this problem worse, while software-based BFI tends to make it better, but I’m not sure why.

In a side by side comparison with my actual CRT monitor vs my LCD with BFI enabled, the LCD actually looks like it has greater motion clarity, but that might be down to screen size and the phosphor persistence on the CRT. I use “soft eyes” a lot when playing retro games (particularly shoot em ups), and motion blur reduction is most beneficial when eye-tracking an individual object, so individual play style tactics and the type of game also come into play, here. Someone who plays a lot of FPS games as a sniper, for example, will see greater benefit in reducing motion blur even more than what 120Hz + BFI is capable of.

The ideal would be 240Hz with ON:OFF:OFF:OFF cadence for 60fps content, but that’s a 75% reduction in brightness, and I’m not sure there are any 240Hz displays that have sufficient brightness to compensate for this and CRT emulation.
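The brightness cost of any strobe cadence is just its duty cycle; a quick sketch (the cadence-string helper is illustrative, not a real display API):

```python
def cadence_nits(cadence, base_nits):
    """Effective brightness for a cadence string like 'ON:OFF:OFF:OFF':
    only ON refreshes emit light, so output scales with the duty cycle."""
    slots = cadence.split(":")
    return base_nits * slots.count("ON") / len(slots)

print(cadence_nits("ON:OFF", 600))          # 60fps @ 120Hz BFI: 300.0 (-50%)
print(cadence_nits("ON:OFF:OFF:OFF", 600))  # 60fps @ 240Hz: 150.0 (-75%)
```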

That’s a fair summary.

I have a CRT on my desk and 3 more in my closet; they’re fantastic displays but they’re getting old and all of them need a bit of work. It’s actually getting hard to find a good CRT monitor these days that doesn’t need a bit of work. Craigslist has pretty much completely dried up where I live. I can clean and recap the monitors, but eventually one of the components is just going to fail entirely, and sourcing a replacement could be difficult and expensive, if not impossible. I’m encouraged by the progress being made by modern displays, and I’m optimistic that with a few more years of R&D into quantum dots, we’ll have displays that do everything we need for a true CRT replacement. We’re getting very close with current displays, though!

Update:

Added the LG SM9500 to the list, making it the only non-QLED display to make the cut.

Update:

It’s worth pointing out that there is still not a SINGLE TV available on the market that is bright enough to do scanlines, black frame insertion, AND mask emulation. You’d need about 2,000 cd/m2 sustained SDR brightness (full-screen).

It’s looking increasingly doubtful that there will ever be a modern TV capable of fully replacing the CRT in all areas of performance. There are a few displays intended for outdoor use that might be able to get bright enough, but none of them have the low input lag needed for next-frame response at 60fps.

So, hang on to those CRTs and learn how to repair them.

EDIT: I should clarify that when I’m talking about mask emulation, I mean setting the mask strength to 100%. If you’re willing to compromise on the mask strength a bit, it’s possible to get a very acceptable result when using mask + scanlines + BFI.
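For a back-of-the-envelope sense of where a number like 2,000 cd/m2 comes from, here’s a sketch. The penalty factors are illustrative assumptions (BFI halves brightness, scanlines black out half the lines, and a 5x hit is assumed for a 100%-strength mask), not measurements:

```python
def required_nits(target_nits, penalty_factors):
    """Stacked CRT effects multiply: each effect's brightness penalty
    scales up the sustained SDR brightness the display must deliver."""
    required = target_nits
    for factor in penalty_factors.values():
        required *= factor
    return required

# Assumed factors; a 100-nit SDR target is the usual reference white.
penalties = {"bfi": 2.0, "scanlines": 2.0, "mask_100_percent": 5.0}
print(required_nits(100, penalties))  # -> 2000.0 cd/m2
```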


100 nits is the specification for SDR TVs, so I doubt you will see one get into HDR luminance territory. In my opinion, the best course of action is to use an HDR TV specced at 2000 nits and use a fitting inverse function to preserve tonality.

Can you elaborate on this a bit?

What we’re interested in is the sustained SDR brightness of an HDR TV since an emulator running in RA is SDR content. I’ve seen several that reach 2000 nits with HDR content, but 500-600 nits seems to be the max for SDR content, even among the new ultra-bright QLED TVs.

Also, what does this mean? “a fitting inverse function to preserve tonality”

I don’t own an HDR TV; you say that SDR plays on HDR TVs at 600 nits max? Couldn’t you just fake SDR content as HDR by applying an inverse function transform (and matching color space transforms)?

If the TV applies a constant tonemapper (PQ, HLG), use the inverse of the tonemapper; this is usually an exponential function. I just don’t know by what means an HDR TV determines what is HDR and what is SDR content, or whether RetroArch glsl/slang has HDR support.
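For what it’s worth, the PQ curve (SMPTE ST 2084) and its inverse are published formulas, so the math side of the “inverse function” idea is straightforward on paper; here’s a minimal Python sketch (whether a given TV actually lets you exploit this is the speculative part):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF: absolute luminance (0..10000 nits) -> PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

def pq_decode(signal):
    """EOTF: PQ signal in [0, 1] -> absolute luminance in nits."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

print(pq_encode(100.0))             # a 100-nit SDR white -> ~0.508 PQ signal
print(pq_decode(pq_encode(100.0)))  # round trip -> ~100.0 nits
```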

It depends on the TV, but yeah that’s about as bright as they get with SDR content, and it’s only the QLED TVs that get that bright (OLED has ABL that prevents you from doing this).

It’s my understanding that it doesn’t, sadly :frowning:

This is way over my head; can you run this by someone who knows about this stuff? @hunterk

If we can fake SDR content as HDR, that would be a godsend for CRT emulation.

I’m speculating; first and foremost, I don’t know if the HDR transfer functions can be inverted. PQ is non-linear, but I recall reading pages about converting SDR content to HDR.

Then the inverse function has to overshoot so as not to cancel perfectly, in order to reach the display’s max nits (2000); that means a parametric inverse function and not a LUT.

And third, I guess RetroArch should be HDR-aware and signal that to the display.

All speculation and far-fetched, but I just wanted to let you know in case I might also learn something out of it.


Take a look at this TV and see if it warrants your attention and consideration.

You can take a look at this one as well:

Here’s one more to check out:

This is a nice review for comparisons:

And here’s one of the newer brighter OLED TVs

Take a look at this thread @Digitech. It might help you to make a more informed decision when it comes to purchasing a new TV.


Thanks a lot! Going to check them all and see if they are available in Mexico.
