BFI/Strobing/Motion Blur thread reorganized

How many people are using shaders with black frame insertion enabled? BFI at 120Hz gives you CRT-like motion clarity, and I can’t use an LCD monitor without it. BFI is necessary to preserve mask details when scrolling, but both BFI and the mask cause a significant decrease in brightness, resulting in a catch-22.

At 60Hz with no black frame insertion, I like to max out the mask (aperture grille from the dotmask shader) and scanline strength, then max out the backlight. However, this only looks good when there’s little to no movement; once things start moving, the mask gets smeared away and just darkens everything. This trick doesn’t work with BFI+120Hz because the screen (an LED-lit LCD) just can’t get bright enough, and the screen door effect from BFI interferes with the mask. The good news is that next-gen displays will be brighter and have better strobing technology.

I’m currently just using tvout-tweaks-multipass + scanlines-sine-abs with BFI + 120Hz + Nvidia lightboost. The screen door effect looks a lot like a high-res dotmask and is actually kind of nice, IMO. With these settings, the LCD looks very close to what I see on my 31 kHz PC CRT; after adjusting the monitor’s color profile for lightboost, the only thing worse on the LCD is the black level. Here’s the preset:

alias0 = ""
alias1 = ""
alias2 = ""
amp = "1.100000"
CRT_GAMMA = "2.400000"
filter_linear2 = "true"
float_framebuffer0 = "true"
float_framebuffer1 = "false"
float_framebuffer2 = "false"
lines_black = "0.000000"
lines_white = "1.000000"
mipmap_input0 = "false"
mipmap_input1 = "false"
mipmap_input2 = "false"
parameters = "TVOUT_COMPOSITE_CONNECTION;TVOUT_TV_COLOR_LEVELS;CRT_GAMMA;TVOUT_RESOLUTION;TVOUT_RESOLUTION_Y;TVOUT_RESOLUTION_I;TVOUT_RESOLUTION_Q;amp;phase;lines_black;lines_white"
phase = "0.000000"
scale_type_x0 = "source"
scale_type_x1 = "viewport"
scale_type_y0 = "source"
scale_type_y1 = "source"
scale_x0 = "1.000000"
scale_x1 = "1.000000"
scale_y0 = "1.000000"
scale_y1 = "1.000000"
shader0 = "C:\Program Files\RetroArch\shaders\shaders_glsl\crt\shaders/tvout-tweaks-multipass/tvout-tweaks-pass-0.glsl"
shader1 = "C:\Program Files\RetroArch\shaders\shaders_glsl\crt\shaders/tvout-tweaks-multipass/tvout-tweaks-pass-1.glsl"
shader2 = "C:\Program Files\RetroArch\shaders\shaders_glsl\misc\scanlines-sine-abs.glsl"
shaders = "3"
srgb_framebuffer0 = "true"
srgb_framebuffer1 = "false"
srgb_framebuffer2 = "false"
TVOUT_COMPOSITE_CONNECTION = "0.000000"
TVOUT_RESOLUTION = "960.000000"
TVOUT_RESOLUTION_I = "83.199997"
TVOUT_RESOLUTION_Q = "25.600000"
TVOUT_RESOLUTION_Y = "256.000000"
TVOUT_TV_COLOR_LEVELS = "0.000000"
wrap_mode0 = "clamp_to_border"
wrap_mode1 = "clamp_to_border"
wrap_mode2 = "clamp_to_border"
4 Likes

For the more curious, this may interest you (RTSS Scanline Sync)

1 Like

How many people are using shaders with black frame insertion enabled?

Honestly, I would use it 100% of the time if it didn’t give me very real image retention problems within a minute of use. This is with overdrive turned off through the OSD. Next-gen can’t come fast enough… or micro-LED, potentially?

There’s actually a bounty to get this implemented in RA, but I think the devs aren’t interested, for reasons I don’t completely understand. It interferes with G-Sync, or something…?

1 Like

What monitor are you using? Just curious.

I’m using an older lightboost monitor, the ASUS VG248QE. It’s one of the first monitors that supported lightboost, and they’ve improved the tech on more recent monitors.

It’s definitely got some strobe-related ghosting, but I’ve yet to see any really bad IR. The ghosted images usually fade within a fraction of a second when the screen is in motion. I’ve got lightboost at the maximum setting (more blur but brighter image). The UFO test looks perfect to me at this setting, so I don’t see any need to lower it.

The biggest hurdles to clear for accurate CRT emulation with BFI enabled are brightness and the screen door effect. I would guess that you need around twice the brightness of current LED-lit LCDs for accurate mask emulation, where the individual emulated phosphors show accurate RGB and luminance values. The screen door effect greatly interferes with the mask and is something that needs to be eliminated with better strobe tech.
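As a rough back-of-the-envelope sketch of why brightness is the bottleneck (every number below is an illustrative assumption, not a measurement): the light reaching your eyes is roughly panel brightness × mask transmission × strobe duty cycle.

```python
# Illustrative brightness budget for mask emulation + BFI (assumed numbers).
panel_nits = 350.0        # assumed peak brightness of a typical LED-lit LCD
mask_transmission = 0.5   # assumed fraction of light a full-strength mask passes
strobe_duty = 0.5         # BFI at 120Hz: image lit for half of each 60Hz period

effective_nits = panel_nits * mask_transmission * strobe_duty
print(f"effective brightness: {effective_nits:.0f} nits")   # ~88 nits

target_nits = 100.0       # assumed CRT-like target white level
required = target_nits / (mask_transmission * strobe_duty)
print(f"panel brightness needed: {required:.0f} nits")      # ~400 nits
```

Under those assumptions, the mask alone roughly doubles the brightness requirement, and BFI doubles it again.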

Higher resolutions are another hurdle to clear in order to accurately recreate the glow effect. In real life, the glow is curved, but fixed-pixel displays and finite signal resolutions force all curved lines to become jagged ones. The higher the resolution, the more realistic you can make the glow. At resolutions high enough that you can’t see the pixels with your nose to the screen (Retina display territory), the glow effect can become indistinguishable from what you see in real life.

Even with all these hurdles cleared, it still won’t look exactly like a CRT, for various reasons; one is that the phosphor on a CRT actually reaches around 10,000 nits for a tiny fraction of a second and then rapidly decays. The light hits your eyes in a totally different way than it does from a backlit display; I don’t think that’s something that can ever be accurately recreated on an LCD.

Side note/interesting observation: even a still image looks better with BFI enabled; something about the strobe effect makes static images easier on the eyes, IMO. I know I get eyestrain-related headaches a lot more frequently when viewing sample-and-hold LCDs than displays that strobe.

1 Like

I’m using an Acer Predator XB271HU. I also tend to use NTSC filters & interlacing shaders a bunch, so that might aggravate it for all I know. Though I still get IR without either of them.

Re: BFI being easier on the eyes: Couldn’t agree more, since I get incredibly woozy from moderate to fast scrolling as it stands right now. When I realized I could play around with ULMB & BFI on my monitor, I got pretty excited until the IR artifacts showed up…

Edit: Might as well add that I watch anime through RA, and I found that the Kurozumi preset set to interlacing produced brutal IR after just one episode of Evangelion: just a superimposed screen of a single interlaced field.

1 Like

Hmm, that does seem odd, and might be something specific to your particular monitor. Current strobe tech can be a bit hit or miss. I get some ghosting artifacts on my display but they go away within a fraction of a second with any amount of motion, and I have yet to see any really bad IR.

You might try increasing/decreasing the lightboost effect (whatever it’s called on your display) to see if that changes things. If it’s already at 0%, maybe try disabling it altogether? You might get better results with just 120Hz + BFI + boosting your monitor’s backlight/contrast/color levels.

Backlight strobing shouldn’t affect image retention, AFAIK, but BFI via RetroArch (i.e., the built-in function) and/or interlacing simulation seems to cause it on some displays, presumably because it’s asking the pixels to function in a way that they’re simply not designed to. /shrug
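The cadence itself is simple; here’s a toy illustration of what 120Hz software BFI asks of the panel (not RetroArch’s actual code, just the presentation pattern):

```python
# Toy illustration of 120Hz software BFI: every other refresh is black,
# so each emulated 60fps frame is visible for only half its frame period.
for refresh in range(8):              # eight 120Hz refreshes = four 60Hz frames
    game_frame = refresh // 2         # each game frame spans two refreshes
    shown = f"frame {game_frame}" if refresh % 2 == 0 else "black"
    print(f"120Hz refresh {refresh}: {shown}")
```

Driving every pixel between content and full black every 8.3ms is exactly the kind of duty LCD panels aren’t tuned for, which may be why some of them retain images.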

I used to have that same VG248QE monitor, which I bought specifically for use with BFI. I hated what it did to colors and the screendoor effect, but I ended up selling it when I realized that it gave me headaches after just a few minutes. I don’t have this issue with real CRTs at 60 Hz, though. I can stare at them for hours on end…

3 Likes

I actually don’t mind the screendoor effect; it just looks a lot like a PC monitor’s dotmask to me. It does interfere with mask emulation, though, which is a bummer if you’re trying to emulate a low-res CRT.

The color shift from lightboost can be pretty bad, but I downloaded and installed this profile and it’s a big improvement; it completely eliminates the purple tint. I also set lightboost to 100% and contrast to 100%. Of course, the black level and resulting contrast ratio are utter shit, but overall I think the image looks pretty good; very close to what it looks like at 60Hz with the backlight at 100%. In a dark room, I might lower lightboost to as low as 50%.

Strangely, I can easily notice the flickering on a CRT monitor through my peripheral vision, but I can’t see it on an LCD with BFI no matter how hard I try, and BFI has greatly reduced my eyestrain-related headaches. This seems to be something that people can react differently to.

2 Likes

I finally broke down and decided to run the majority of cores using ULMB instead of VRR since I seem to be able to avoid IR this way (thanks hunterk!). This will become a problem for some systems & games, naturally, but it is what it is.

Is there a checklist to get ULMB working properly/as well as possible? I’ve turned off “Sync to Exact Content Framerate”, made sure VSync is on, made sure the refresh rate is set to 120Hz in RA (or rather 119.8 or some such), and then… does VSync Swap Interval play a role? If I leave it at 1, the FPS counter tells me RA is still trying to match the game’s native refresh rate, but at 2 it seems to stay as close to 60 as possible.

1 Like

I don’t think there’s a checklist/guide for black frame insertion, but that’s a good idea.

I just leave the vertical refresh rate at 60.000. This seems to give the best performance and eliminates skipped/dropped frames; I get a steady 60 FPS this way. If I set it to 120.000 or use the display-reported refresh rate, I get occasional hiccups. Setting vertical refresh to 60.000 with BFI+120Hz may have the same effect as vsync swap interval 2; I’m not sure.
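For reference, here’s roughly how that maps onto retroarch.cfg. Treat this as a sketch rather than a verified config; option names can vary between RetroArch versions, and newer builds take an integer count for BFI rather than a boolean:

```
# Sketch of the relevant retroarch.cfg entries (names may differ by version)
video_vsync = "true"
# Leave vertical refresh at 60 even with the display running at 120Hz
video_refresh_rate = "60.000000"
# BFI toggle; newer builds use an integer count of black frames instead
video_black_frame_insertion = "true"
# "Sync to Exact Content Framerate" off, per the post above
vrr_runloop_enable = "false"
```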

Other tips: You probably already know this, but you have to adjust your monitor’s settings for lightboost to look good. I settled on lightboost 100% and contrast 100%, as the additional reduction in motion blur from reducing lightboost is negligible IMO and not worth the reduction in brightness. This destroys the black level but the colors are well-saturated and the image is acceptably bright (around 120 cd/m2 max white). In a dark room, I think I can get away with 50% lightboost. If I’m not using BFI/Retroarch, lightboost at 10% is very acceptable.

I played around some with color-mangler to see if I could improve things further, but didn’t find it to be particularly useful in this endeavor. If your display is already well-calibrated, adjusting the colors/contrast/saturation/etc via shaders seems to only make things worse. You can’t overcome the display’s limitations by using shaders.

However, tvout-tweaks-multipass has proven to be very useful. Enabling the “TV out color levels” option gives one of the best-looking and easiest brightness and color adjustments I’ve seen, and the composite option lets you adjust luma and chroma. The signal resolution parameter and gamma adjustments are also very useful. I’ve set every system to use tvout-tweaks-multipass.

1 Like

While strobing tech can be expected to improve, I am less optimistic about its future suitability for retro gaming.

What we want, in order to emulate the CRT scan (which helps clear away the image persistence in our eyes), is strobing (distinct from BFI, which is an inferior solution) at 60Hz, to match the 60fps of most retro games. But it looks like manufacturers will instead focus on faster strobing, aimed at modern gaming, sports, movies, etc. Only a few older models strobed at 60Hz (often only accessible through service menus or other unorthodox means), and 60Hz strobing may look too flickery for some people.

What we really want is a rolling-scan strobe at 60Hz. I only know of a couple of earlier professional OLED monitors that have this feature, and I see no incentive for manufacturers to develop and introduce it on future consumer products. The “purist” retro gamer market is too small (and by playing on non-CRT displays, you’re already not much of a purist).

@Umwelt

Is there any way to DIY a rolling-scan strobe? Before lightboost, people were hacking displays with brighter LEDs and strobed backlights.

Is it possible to have a panel custom-built with the specs needed for rolling-scan strobing? CRTs won’t last forever. I could see a niche market for such custom-built displays, especially as good CRTs become increasingly scarce. If people will spend $400 on a Framemeister… :confused:

Even without strobing, BFI will still be a solution at higher refresh rates, and brighter displays should make the brightness reduction a non-issue.

I have no idea if a DIY approach would be possible for this. I imagine that as monitors get fancier, it will only get harder to hack them into doing anything the manufacturer didn’t intend.

But yeah, BFI could become a “good enough” solution for motion persistence, as long as adjustable flicker rates are offered.

2 Likes

Maybe the future of CRT emulation will be with virtual reality headsets :thinking:

"CRT-Clarity OLED Gradually Arriving

Today, Oculus Rift and HTC Vive virtual reality headsets already achieve CRT motion clarity via their pulsed rolling-scan technique. Most VR headsets need to eliminate display motion blur, to avoid nausea in virtual reality.

Some 2018 and 2019 OLEDs such as LG HDTVs have a black frame insertion mode. Newer OLEDs are having longer black frame intervals with shorter refresh cycle visibilities, to keep reducing OLED motion blur, especially as OLEDs get brighter with enough headroom for motion blur reduction.

We need an optional low-persistence mode even for ultra-high-Hz OLEDs too such as JOLED’s upcoming eSports gaming OLED monitor. Although doubling the refresh rate will halve motion blur, it is still necessary to use impulse techniques to reduce motion blur further.

For more reading on display motion blur behaviors and example animations, see Blur Busters Law: The Amazing Journey To Future 1000Hz Displays." Via -> https://www.blurbusters.com/faq/oled-motion-blur/

Also "CRT style gaming on LCD

LightBoost provides gaming with a CRT-like clarity, with zero motion blur — allowing complete immersion without being distracted by motion blur. Improved competitive gaming scores are possible thanks to a faster reaction time. See improved Battlefield 3 statistics graphs.

A high end GPU is required (e.g. GeForce GTX 680, 780 or Titan) to hit 120fps @ 120Hz most of the time, to really notice the big improvement in motion clarity, with perfectly clear images even during fast turning and strafing. Also, it is necessary to disable the GPU’s artificial motion blur effects in video games, as they can ruin the LightBoost motion blur elimination. Also, some Source-engine games need their fps_max raised at the developer console to play smoothly." Via -> https://www.blurbusters.com/zero-motion-blur/lightboost/

More useful links for the curious

1 https://www.blurbusters.com/zero-motion-blur/10vs50vs100/

2 https://forums.blurbusters.com/viewtopic.php?t=3213

2 Likes

Also, I think 480Hz without strobing or BFI should have practically zero motion blur. Just give it another decade…

More likely, these technologies will start being integrated into high-end gaming displays. Near term, 1-5 years from now, we’ll still depend on strobing and high refresh rates to reduce motion blur.

In 5-10 years we’ll have 480Hz displays which will have practically zero motion blur at 60fps. In 10-20 years, we’ll have 1000Hz displays.

Strobing and BFI are really just stopgap technologies on the road to 480+Hz displays. I feel pretty good about the future of display tech and its suitability for retro gaming.

1 Like

After some experimenting, I’ve decided to disable lightboost on the ASUS VG248QE.

Using a light meter, I took three measurements each for BFI only, lightboost only, and BFI + lightboost, then averaged them.

Results:

lightboost + BFI: 161 lux
BFI only: 212 lux
lightboost only: 232 lux

This is with contrast and brightness at 100%. Lightboost was set to 100%.

Summary: Lightboost only gives you the brightest image, but it has bad motion artifacts. To get it as smooth as BFI only, you have to turn down lightboost %.

BFI only is just slightly dimmer than lightboost only, but results in perfectly smooth motion.

lightboost + BFI is significantly dimmer than either lightboost only or BFI only, and I can’t tell the difference in motion clarity.

The winner is BFI only, at least on the ASUS VG248QE. Newer monitors with improved strobing technology may have superior motion clarity and/or brightness with either strobing only or strobing + BFI.

Chief Blur Buster here.

I have some good news (for both emulator developers and emulator users).

Please see: https://forums.blurbusters.com/viewtopic.php?f=4&t=5299&p=40755#p40755

2 Likes

Unfortunately, if you avoid strobing, motion blur is inextricably linked to frame rate.

Non-strobed 60fps at 480Hz looks identical to 60fps at 60Hz, no matter the display technology, even at 0ms GtG. That’s because a non-strobed 480Hz display continuously displays the same emulator “60Hz” frame seamlessly for several refresh cycles; it just looks like a single 60Hz refresh cycle.

See GtG vs MPRT: Frequently Asked Questions About Pixel Response.

As the inventor of TestUFO, I’d point to another great demo for understanding the framerate-vs-blur effect: the Variable Refresh Rate Simulation Animation. The higher the frame rate, the less motion blur; double the frame rate and you halve motion blur on a non-strobed display. Anybody who sees www.testufo.com on a 240Hz display understands that 240fps has a quarter of the motion blur of 60fps, if you’re using non-strobed frame rate to reduce motion blur.
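To put illustrative numbers on that (the scrolling speed is an assumption; simple sample-and-hold persistence):

```python
# Perceived smear on a sample-and-hold display:
# blur (px) = scroll speed (px/s) * pixel visibility time (s).
speed = 960.0                      # assumed scrolling speed in pixels/second
for hz in (60, 120, 240, 480):
    blur_px = speed * (1.0 / hz)   # each unique frame persists one full refresh
    print(f"{hz}fps at {hz}Hz: {blur_px:.0f} px of smear")
# Note: 60fps on a non-strobed 480Hz panel still persists 8 refreshes = 1/60 s,
# so it smears the same 16 px as 60fps at 60Hz.
```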

The only way to reduce motion blur is to reduce pixel visibility time via:

  1. Add black time between refresh cycles (phosphor, pulsing, BFI, strobing, etc)
  2. Add more frames between frames to shorten frame visibility time.

Unfortunately, 8-bit games only run at 50fps or 60fps, so you are stuck with option (1) unless you use interpolation or another Frame Rate Amplification Technology, preferably one of the perceptually lagless methods.

In the longer term (by 2030), the great news is that retina refresh rates (1000Hz+) can also help emulate a CRT more accurately. Retina resolution will handle the HLSL-style spatial effects, while retina refresh rates handle the CRT emulation temporally, to vision acuity levels (software rolling-scan emulation, software phosphor-fade emulation).

With retina refresh rates, the sheer number of physical refresh cycles per 60Hz emulator frame makes possible finer-granularity emulation of the temporal behaviour of historic impulse-driven displays (e.g. CRTs).
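As a toy sketch of what that granularity permits (hypothetical 480Hz panel, so 8 physical refreshes per 60Hz emulator frame; a real implementation would do this per scanline on the GPU):

```python
# Toy software rolling scan: light one horizontal band per physical refresh,
# sweeping top to bottom across the 8 sub-refreshes of each 60Hz frame.
PHYSICAL_HZ, EMULATED_HZ = 480, 60
SUBFRAMES = PHYSICAL_HZ // EMULATED_HZ    # 8 sub-refreshes per emulated frame

def lit_band(frame_height: int, subframe: int) -> tuple[int, int]:
    """Rows (top, bottom) lit during this sub-refresh; the rest stays black."""
    band = frame_height // SUBFRAMES
    top = subframe * band
    return top, top + band

for s in range(SUBFRAMES):
    print(f"sub-refresh {s}: rows {lit_band(240, s)}")
```

Add a per-band brightness falloff across sub-refreshes and you get software phosphor-fade emulation on top of the rolling scan.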

Chief Blur Buster

3 Likes