BFI/Strobing/Motion blur thread reorganized

This makes sense, and I learned a lot from the post at the Blur Busters forums. Really great info; I’m sold on 240Hz!

All you need is 120Hz or higher, and adjustable software-based black frame insertion, which seems like it should be fairly easy to implement in emulators. This should be sufficient to take care of motion blur until we get to 1000Hz displays.

I think the next hurdle to clear for CRT emulation is brightness. I’m excited about the brighter overdriven strobed displays now available, but does the strobing add input lag compared to software-based black frame insertion? If so, I’d probably prefer to go with the brightest non-strobing 240Hz display available with <16ms input lag. 500 nits non-strobing should be sufficient for scanlines and BFI @ 120Hz, and still provide plenty of punch.

However, don’t you need even more brightness if you’re using higher refresh rates and adjustable black frame insertion to get 60fps? If you’re doing OFF:OFF:OFF:ON @ 240Hz with 60fps, isn’t that a 75% reduction in brightness? If you add scanlines that’s now a ~90% reduction in brightness and if you add the RGB mask, ~95%, so now you need 2,000-3,000 nits :joy:. I hope I’m wrong about this…
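Rough back-of-the-envelope version of that math (just a sketch; the loss factors below are my ballpark assumptions, not measurements):

```c
/* Rough brightness math for software BFI + scanlines + RGB mask.
 * All factors below are ballpark assumptions for illustration only. */
#include <stdio.h>

int main(void)
{
    double bfi_duty  = 0.25; /* 1 visible refresh out of 4 at 240Hz (60fps) */
    double scanlines = 0.40; /* scanline overlay keeps ~40% of the light    */
    double rgb_mask  = 0.50; /* RGB mask keeps ~50% of what's left          */
    double target    = 150;  /* nits we'd like to actually see on screen    */

    double factor = bfi_duty * scanlines * rgb_mask;   /* ~0.05 => ~95% loss */
    printf("Combined factor: %.3f (~%.0f%% reduction)\n",
           factor, (1.0 - factor) * 100.0);
    printf("Panel nits needed for %.0f effective nits: %.0f\n",
           target, target / factor);
    return 0;
}
```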

Ultimately, to do CRT emulation with BFI, scanlines and realistic RGB mask emulation it might take something like the custom-built, overdriven, water-cooled, edge-lit LCD you described, with retina resolution and <16ms input lag :laughing:. You probably need 1000-2000 nits if using BFI @ 120Hz, to ballpark it. It would be cumbersome and ugly, but you could hide it inside of a CRT-like case! :stuck_out_tongue: I really hope someone tries this.

3 Likes

Generally correct for software-based BFI, yes, if you’re not relying purely on hardware-based voltage-boosted strobing.

Also, to reduce input lag for 60Hz@240Hz software BFI, you want to do “ON:OFF:OFF:OFF” rather than “OFF:OFF:OFF:ON” – show the visible frame quicker. A subtle thing, but something to consider in development.

Oh, and if you’re using a TN LCD with 6-bit FRC to do 60fps at 240Hz, one may prefer “ON:ON:OFF:OFF”. That way you get 8-bit color resolution instead of 6-bit color resolution. And you eliminate the inversion artifacts because the even/odd refresh cycles are properly paired up. Then again, many 8-bit games only use a few colors…
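For illustration, here’s a minimal sketch of what an adjustable software-BFI cadence like that could look like in a frontend. The cadence strings, the Frame type, and the present_* calls are hypothetical placeholders, not an existing LibRetro/RetroArch API:

```c
/* Minimal software-BFI cadence sketch, assuming a 60fps emulator on a display
 * running at an integer multiple of 60Hz. One character per refresh cycle:
 * '1' = present the emulator frame, '0' = present black.
 * "1000" = ON:OFF:OFF:OFF at 240Hz (lowest lag),
 * "1100" = ON:ON:OFF:OFF at 240Hz (even/odd pairing for 6-bit FRC panels). */
#include <string.h>

typedef struct Frame Frame;          /* placeholder for the emulator's frame */
void present_frame(const Frame *f);  /* hypothetical frontend hooks          */
void present_black(void);

static const char *cadence = "1000";

void bfi_on_refresh(unsigned refresh_index, const Frame *emulator_frame)
{
    size_t len = strlen(cadence);

    if (cadence[refresh_index % len] == '1')
        present_frame(emulator_frame);   /* visible refresh */
    else
        present_black();                 /* black refresh   */
}
```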

Yes, that is a definite concern.

So you’d want an HDR display to get the headroom. Those 4000-nit and 10,000-nit prototype displays I saw at CES would be damn useful for these kinds of things. They didn’t need water cooling because they were designed with brighter LEDs right from the outset.

The homebrew water cooling idea was only for reusing an existing LCD edgelight to be vastly brighter than it normally is … but you could theoretically just rip it out and simply use a large heatsinked LED array line-focussed into the edge through reflectors/prisms, perhaps. The Talbot-Plateau strobing law is indeed pesky here, but at least manageable because LEDs are getting brighter and brighter. Imagine stadium LED brightness piped into the panel. But hacking an LCD backlight is an expensive endeavour – few modders want to risk their $500 gaming monitor to a failed modification!

Another out-of-the-box solution: the new 4K 144Hz local-dimming HDR displays are now 1000 nits, if one would like to use those with emulation… Better contrast ratios closer to CRT, the better colors of IPS, great resolution for HLSL, 8-bit color preserved with software BFI, and plenty of nit headroom to get ~500 nits with software BFI. No ULMB though, so not as big a reduction in motion blur. That said, these are desktop monitors costing $2000 – they use a 384-zone local dimming backlight, rather than a simple edgelight.

3 Likes

Hey, thanks for the explanations. It was interesting to read :+1:

(I’m going into cryogenic sleep, wake me up in 2030!) :older_man: haha

1 Like

@hunterk

Can we get this discussion (starting at post 545) moved to this thread?

Don’t want to get too off topic/derail the current thread any further.

Thanks! :smiley:

2 Likes

@hunterk

Can we also move the additional posts for earlier BFI context?

The person who made the assumption about 480Hz, etc

Please merge at least 4 (or so) more earlier posts on this topic

1 Like

Ok, hopefully this timeline makes sense. I tried to merge all of the posts into a single thread, but this forum software kinda sucks at this sort of thing…

1 Like

The monitor industry seems like it’s in a bit of a funk, these days. At these monitor prices, I’m starting to think I’d be better off just going for a TV… Looking at the specs compared to TVs these days, I’m not sure what is justifying these high monitor prices.

What are your thoughts on the LG C9 55" 4K OLED TV?

I’m concerned that the brightness limiter will prevent me from pushing the brightness levels that I’m seeking, and/or that I won’t be able to push it to max brightness levels unless displaying HDR content (no good for what I’m trying to do with emulators).

It’s very monitor-specific. IR artifacts are more common with software BFI on 120Hz TN panels, because of the way software BFI “accidentally” defeats the alternating positive/negative voltages of LCD inversion voltage-balancing algorithms.

IR artifacts disappear if you use:

  1. 120Hz IPS panels
  2. 60fps BFI at 180Hz on the BenQ XL2540/XL2546/XL2740
  3. 60fps ON:ON:OFF:OFF BFI at 240Hz on any 240Hz TN monitor

If you pair up even/odd refresh cycles (so that the next visible refresh cycle is an odd number of refresh cycles after the last visible one), the image retention artifacts stop appearing.

And the inversion artifacts disappear too.


That checkerboard-texturing problem also disappears with options (2) and (3), and the color depth actually increases too (looks 8-bit instead of 6-bit), since software BFI can interfere with 6-bit FRC.

That’s why I prefer to advocate either (A) IPS panels, or (B) 240Hz TN monitors if you want to use software BFI with emulators.

Also, there are some image-retention compensation algorithms for 60fps@120Hz that work by automatically swapping the positive/negative phases occasionally (basically adding a single-Hz skip). Using a 50% alphablended BFI refresh temporarily for that single-Hz skip also eliminates the flicker of the skip, so IR artifacts can be prevented automatically. The single-Hz skip for anti-IR can be done just once every 30 or 60 seconds.
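Here’s a rough sketch of one way to read that single-Hz-skip idea, layered on a plain 60fps@120Hz BFI loop. The Frame type and present_* hooks are hypothetical placeholders, and the interval/alpha values are just the ones mentioned above:

```c
/* Anti-IR "single-Hz skip" for 60fps@120Hz software BFI: once every ~60
 * seconds, repeat the visible frame for one extra refresh cycle (at 50% alpha
 * to hide the flicker). That shifts every later visible refresh by one cycle,
 * flipping which LCD inversion polarity carries the image. */
typedef struct Frame Frame;                    /* placeholder frame type      */
void present_frame(const Frame *f);            /* hypothetical frontend hooks */
void present_blended(const Frame *f, float a);
void present_black(void);

#define ANTI_IR_INTERVAL (60u * 120u)          /* one polarity swap per ~60s  */

void bfi_120hz_refresh(const Frame *emulator_frame)
{
    static unsigned count = 0;
    static int show_frame = 1;                 /* 1 = visible, 0 = black      */
    static int swap_pending = 0;

    if (count != 0 && count % ANTI_IR_INTERVAL == 0)
        swap_pending = 1;                      /* request a polarity swap     */

    if (show_frame) {
        present_frame(emulator_frame);
        show_frame = 0;
    } else if (swap_pending) {
        present_blended(emulator_frame, 0.5f); /* half-bright repeat refresh; */
        swap_pending = 0;                      /* show_frame stays 0, so the  */
                                               /* whole cadence shifts by one */
    } else {
        present_black();
        show_frame = 1;
    }
    count++;
}
```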

A more advanced anti-IR algorithm is now in the open-source xash3d BFI code on GitHub. Best anti-IR algorithm I’ve ever seen! It even resists IR on some of the worst-IR monitors. (If I may say so myself: I taught the programmer how to eliminate IR with software BFI!)

Other people don’t understand the artifacts. Blur Busters is the one that understands the WHY. We understand displays. We understand the black box between Present() and photons! :wink:

2 Likes

I’m running an Acer Predator XB271HU which is a 120Hz IPS panel and I still get heavy IR. :confused:

I’ve read some of your stuff on the BlurBusters forums regarding IR compensation algorithms, and it makes all kinds of sense. AFAIK this is not yet possible in RA.

Forgive me if this lies outside of your area of expertise, but would you say NTSC filter effects could play a part in aggravating IR problems?

Edit: You may disregard what I wrote about IR on my IPS panel. I just played around with some NES games using BFI + ULMB @ 120Hz with VSync Swap Interval at 1… and I couldn’t see a trace of it after several minutes of play. I don’t know what I’ve been doing wrong previously.

1 Like

Definitely not.

IR is caused by interference with the positive/negative voltage balancing of LCD inversion. The black refresh cycle unbalances one of the polarities.

1 Like

How much input latency is added by hardware strobing? If it’s just a couple ms, it might not matter, so long as the total input latency is less than 16ms. I don’t want to completely rule out hardware strobing yet.

Also, when it comes to HDR displays, I’m concerned that I won’t be able to actually utilize the extra brightness that HDR offers with non-HDR content/applications (such as emulators). Will these displays be able to reach the desired brightness levels when displaying non-HDR content?

When it comes to OLED, I’m concerned that the brightness limiters in these displays will prevent me from pushing the max brightness levels that I’m seeking, and there’s still the burn-in issue.

Half a refresh cycle of scanout, on average.

The faster the scanout is done in the dark before the flash, the less lag. 120Hz strobing adds an average of 4ms of lag at screen center, and 240Hz strobing adds an average of about 2ms (half a refresh cycle).

The strobing lag is because of two things:

(1) LCDs don’t refresh all pixels at once. See the high speed videos of an LCD refresh sequence at www.blurbusters.com/scanout

(2) After the scanout is done in the dark, there may sometimes be a small delay to wait for GtG to mostly finish before flashing the backlight.

On average, strobe lag is approximately half a refresh cycle. 120Hz strobing has an average midscreen lag penalty of 4ms (more lag at the top of the screen, less at the bottom). The higher the Hz, the lower the strobe lag. So 120Hz strobing + 60fps BFI will still preserve the low lag of a high-Hz strobe, despite the emulator being only 60fps.
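As a quick sanity check of those numbers, using only the half-a-refresh-cycle rule above (it ignores any extra GtG wait):

```c
/* Average mid-screen strobe lag = half a refresh cycle of scanout time. */
#include <stdio.h>

static double avg_strobe_lag_ms(double refresh_hz)
{
    return 0.5 * (1000.0 / refresh_hz);
}

int main(void)
{
    printf("120Hz strobe: ~%.1f ms\n", avg_strobe_lag_ms(120.0)); /* ~4.2 ms */
    printf("240Hz strobe: ~%.1f ms\n", avg_strobe_lag_ms(240.0)); /* ~2.1 ms */
    return 0;
}
```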

1 Like

As we are the guys who understand the black box between Present() and photons…

No shrug needed at all! We can answer your questions.

See the Lagom and Techmind links at the top of www.testufo.com/inversion to understand. Each refresh cycle alternates in voltage polarity. If you have questions about why BFI artifacts show up, just ask. We understand why. It’s not the interlace, it’s not the strobing; it’s the software BFI accidentally being in sync with the positive/negative voltage balancing system (the LCD inversion algorithm).

On some LCDs, this unbalances the panel when one voltage polarity carries the image and the other carries no image. Unbalanced voltage buildup is like static electricity: image retention. It happens when one of the voltage polarities (e.g. positive) is always black while the other (e.g. negative) always has an image. It’s pesky when LCD inversion and software BFI accidentally interfere with each other. Good monitors will automatically change the inversion algorithm to compensate if they detect an unbalanced situation. But not all do…

2 Likes

There’s an anti-BFI-burn-in algorithm to eliminate image retention, currently used in xash3d, and it can also be used to improve emulator BFI.

Also, source code for image-retention-resistant software BFI:

Hope this helps other developers!

In summary, anti-burn-in for software BFI is essentially:

  • Rebalancing the even/odd refresh cycles via multiple methods, such as 60fps@180Hz (so each visible frame lands on the opposite polarity from the previous one), 2x60fps@240Hz (two-refresh-cycle repeats), or, if using 60fps@120Hz, inserting a repeat refresh once every X seconds (e.g. every 60 seconds), possibly with alphablend (50% BFI) to prevent flicker.
1 Like

Looks like your comment got cut off there. Am I right that 240Hz strobing adds an avg of 2ms lag? Is this regardless of fps? (i.e., does 120fps with a 240Hz strobe have the same input lag as 60fps with a 240Hz strobe?)

I couldn’t find the input lag of the BenQ Zowie XL2546, which you recommended in the Blur Busters post. It’d be really nice to know if the total input lag with strobing is <16ms.

Also, you recommend software BFI with the BenQ Zowie XL2546, but does the voltage-boosted strobe brightness work with software BFI? I.e., will the monitor reach 300-400 nits with software BFI? That’s what you need to add scanlines and still have an acceptably bright display.

Can the HDR displays reach 1,000 nits when displaying SDR content and applications (such as emulators…)?

So many considerations to juggle!

Looking at gaming displays, there just aren’t many good options that have the required brightness while strobing. The ASUS ROG PG27UQ could be the one to rule them all, but like you said, it’s $2,000 for a monitor, when you could spend just as much (or less!) on a really nice 50-60" 4K HDR TV.

It seems like monitor manufacturers are charging a high premium for features that only really benefit modern gaming, which makes sense. For retro gaming via emulators, however, a new HDR TV probably makes more sense.

Went through all the reviews on RTings.com, and these are probably the best TVs you can get for CRT emulation and low input lag.

All of these have very high brightness (>500 nits with SDR), low input lag (<16ms at 1080p@120Hz), and 120Hz + black frame insertion.

Vizio P Series Quantum
Samsung Q90/Q90R QLED
Samsung Q9FN/Q9/Q9F QLED
Samsung Q900/Q900R 8k QLED
Samsung Q7FN/Q7/Q7F QLED
Samsung Q7CN/Q7C QLED
LG SK9500
1 Like

You got it! Those are exactly the televisions you want for emulation.

I’d throw in all the 2017, 2018, and 2019 LG OLEDs too, because the LG 4K OLEDs made since 2017 are capable of 120Hz at 1080p. That gives you great software BFI, as OLED usually behaves very well with software BFI. It looks stunning with emulators that have software BFI.

Most of those may not have hardware 60Hz BFI but they are very software-BFI friendly. Besides, 8bit emulators work fine with 1080p on these OLEDs anyway, and while the resolution of scanline filters may be degraded by that, it’s not all that bad, especially at normal TV viewing distances.

With software BFI on 120Hz OLEDs, you’ll get an exact halving of lumens, and an exact halving of display motion blur. But no color depth loss (like TN 6-bit LCDs), as OLEDs are already full color depth per refresh cycle.

1 Like

Fixed, but I did already mention it later in the same post. Yes, 240Hz strobing adds an average of 2ms of lag, even with 60fps@240Hz if you use 1:0:0:0 BFI. Basically, one visible refresh, three black refreshes.

Now, you still get inversion artifacts and potential burn-in if you do that. Much better quality (and slightly brighter) is 60fps@180Hz using 1:0:0 BFI. This solves the odd/even alternating problem, as the odd-numbered BFI sequence (A) eliminates the pesky checkerboard-pixel-texture problem, (B) fixes the color depth loss by staying 8-bit instead of looking 6-bit, (C) eliminates image retention, and (D) is slightly brighter.

So I tend to like 60fps@180Hz BFI when using an XL2546. The 60fps@240Hz has slightly less motion blur but is darker.

The alternative is to turn off DyAc (strobing) and use repeat refreshes with a 1:1:0:0 BFI cadence instead, to make sure you have the paired even/odd refresh cycles that (A) solve checkerboard texturing, (B) solve the 6-bit depth loss, and (C) prevent image retention from accidental unbalancing of the alternating positive/negative polarity refresh cycles of LCD voltage inversion algorithms.
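A quick way to see why the odd-length cadence matters, just by counting refresh indices (nothing monitor-specific assumed here):

```c
/* With 1:0:0 at 180Hz, consecutive visible refreshes are 3 cycles apart, so
 * they alternate between even and odd refresh indices (i.e. between the two
 * inversion polarities). With 1:0:0:0 at 240Hz they are 4 cycles apart, so
 * every visible refresh lands on the same polarity. */
#include <stdio.h>

int main(void)
{
    for (unsigned frame = 0; frame < 5; frame++) {
        unsigned r180 = frame * 3;   /* visible refresh index, 1:0:0 @ 180Hz   */
        unsigned r240 = frame * 4;   /* visible refresh index, 1:0:0:0 @ 240Hz */
        printf("frame %u: 180Hz -> refresh %2u (%s), 240Hz -> refresh %2u (%s)\n",
               frame,
               r180, (r180 % 2) ? "odd" : "even",
               r240, (r240 % 2) ? "odd" : "even");
    }
    return 0;
}
```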

No, I wrote about this already. Your brightness loss will be proportional. 300 nits voltage-boosted will still be 150 nits at 50% BFI, or 75 nits at 75% BFI. That’s still brighter than the 25-50 nits that some BFI+strobing combinations give you. You get the voltage boosting, but the loss is proportional to the BFI.

You can also run the XL2546 at 60fps@120Hz for a brighter software BFI, and adjust persistence using the Blur Busters Strobe Utility https://www.blurbusters.com/strobe-utility (or use the Factory Menu) to get a brighter strobe with a bit more motion blur. I guesstimate getting to 150 nits (still much brighter than most LightBoost + software BFI), though you’ll have to settle for the 6-bit look + checkerboard pixel pattern (and potential IR). I’m not sure how prone the XL2546 is to IR, as I have not done extensive 60fps@120Hz tests with it.

Hope this helps!

P.S. I don’t understand why this deserves to be in the OffTopic thread. These are all relevant to today’s emulators on today’s displays, with existing LibRetro features.

3 Likes

That’s good to know! Is that something unique to OLEDs?

Haha, sorry. I was pretty tired when I wrote that post, and I feel like my head is exploding with new information :stuck_out_tongue:

Okay, this makes sense now. I didn’t realize that it was that low without the voltage boosting. The XL2546 looks like a really great display for modern gaming, but you probably need a bit more brightness for scanlines. I think 400 or 500 nits voltage boosted would give an acceptably bright image with scanlines (at 50% BFI).

To make things more difficult, RTings doesn’t always specify if a display uses voltage-boosted strobing or not. The XL2540 looks like it’s bright enough, but I’m not sure if it has the voltage boosted strobe, and it lacks the custom 180Hz mode… now my head is starting to hurt again :smiley:

Very helpful information, thanks!

I think it’s just a general catch-all category. Maybe Retroarch -> General would be a more appropriate category? I think the forums are very focused on Retroarch itself.

2 Likes

So does this mean a 60Hz hardware strobe for 60fps adds ~8ms of lag? For example, the LG SK9500 can strobe at 60Hz for 60fps, but an 8ms input lag penalty would push the total average input lag to 22.5ms for 1080p @ 60Hz, so you wouldn’t get next-frame response with that mode. Many of the listed TVs can also strobe at 120Hz for 120fps, including the LG SK9500, which seems like it would work well with software BFI.

This gets somewhat confusing, because black frame insertion (e.g. 120Hz for 60fps) and strobing with fps = Hz both get called black frame insertion.

Is this true for QLED as well?

I looked at the LG OLEDs for 2018 and 2019, but from what I could tell, they aren’t bright enough when displaying SDR content, have aggressive ABL, and/or have input lag >16ms.