Doesn’t look great on my 240hz monitor, not sure what i’m doing wrong. I’m getting dark colors, obvious horizontal dark stripes that slowly move upwards, and the occasional flicker.
I am going to guess that your monitor is not running at a perfect 240hz but maybe 239.98hz or 240.01hz etc.
Try making a custom 180hz mode and see if that fixes it. Alternatively, contact blurbusters on twitter or comment on the github thread below.
also, with the dark colors, make sure you adjust the gamma parameter. You should be able to make it neutral there.
I tried this but the colors are still dark.
I can also reduce the banding stripes by lowering my monitor’s RGB gains from 100 to 50. But that makes it even darker.
I have run it at 120 and it looks a bit darker (which is to be expected) but no flickering. And the motion clarity really is a lot better.
I do have the darker band which travels up the screen though, does this disappear at higher refresh rates? Or is it something with the ramp of the scan effect?
Maybe @BlurBusters knows something about this…
the bar traveling up the screen is part of the LCD persistence prevention. You can turn it off in the parameters.
I did try that and the bar just seemed to stay in one place
that’s right. It might need some other parameter to try and push it offscreen similar to the RTSS tear-line thing.
There might also be something up with individual displays. If it’s a full-black bar, it may be related to a sort of clipping behavior that some monitors are apparently doing.
Ah, good idea, maybe I’ll try that; if it’s adjusted towards the bottom it might be much less visible.
Yeah, thankfully I’m not getting a black bar, just a band that is dimmer than the rest
No way to get rid of the horizontal stripe/bar. There’s nothing in the parameters for this. It’s just the shader not being compatible with all monitors.
try adding a parameter to the line:
float crtRasterPos = mod(effectiveFrame, EFFECTIVE_FRAMES_PER_HZ) / EFFECTIVE_FRAMES_PER_HZ;
I think just a float added to the end of it may do the trick. I don’t have that artifact here, so I can’t really test, unfortunately.
More motion blur reduction with less flicker than BFI. Phosphor decay and rolling is much gentler than squarewave BFI. It looks better than BFI, as long as you can calibrate the artifacts out.
- 120Hz – You can get up to 50% motion blur reduction for 60fps
- 240Hz – You can get up to 75% motion blur reduction for 60fps
- 480Hz – You can get up to 87.5% motion blur reduction for 60fps
The 120-vs-480Hz difference is more visible to humans than 60-vs-120Hz, so do not hold back your Hz upgrade.
Some tips:
- The band moves vertically because of the LCD Saver system (it prevents image retention), but the band should not be blatantly visible if you calibrate properly.
- Make sure you’ve calibrated your black levels and white levels. You can use www.testufo.com/blacks and www.testufo.com/whites
- If you have any black clipping or white clipping, you may get some bands. Try slightly lifting your blacks and/or dropping your whites a little bit, and see how that goes. You could also try a picture adjustment shader before the CRT simulator shader.
- Also, ABL/APL behaviors will be a big problem on some displays, because the display can dynamically readjust itself on the fly mid-scanout, ruining the Talbot Plateau mathematics and creating banding.
- Try both SDR mode and HDR mode; sometimes it behaves much better in SDR mode. Also try much lower GAIN_VS_BLUR values (“Brightness vs Blur” in RetroArch) and see what happens.
- Picture adjustment shaders MUST come before the CRT emulation filter. No picture adjustments (contrast/saturation/gamma) can occur afterwards, since they could violate the Talbot Plateau Law and create banding. When you design shader sequences, sequence things the same way as a real picture tube (picture processing circuit -> electron gun with electron beam -> hits phosphor).
- Temporarily turn off your CRT filter and see what happens. Some CRT filters aren’t very Talbot Plateau compatible. If you choose an end-stage filter, make sure it’s not adjusting the picture in post-process while filtering (this will distort the Talbot Plateau law, because most post-process color adjustments are in gammaspace rather than linearspace).
Try various different CRT filters. Certain CRT filters have incompatible maths; some of the CRT phosphor filters seem to not be properly Talbot Plateau preserving (energy-level preserving). This is the stupendous challenge of an overlapped-gradient rolling scan, where I must keep the average photon count consistent despite the shingle-overlapped phosphor-fade subframes, which requires Talbot Plateau Law-perfect mathematics – and postfilters sometimes distort this. Not all of them do, but some of them do – experiment with different filters, and try moving the picture-adjustment filters to before the CRT beam scanning.
Actually I made the band disappear with some of the above tricks.
It varies by display but sometimes I got the band to be 90-99% fainter.
Try (1) raising your blacks to a dim grey, (2) lowering your whites, (3) trying both HDR and SDR mode, (4) lowering GAIN_VS_BLUR, (5) no picture adjustment filters AFTER the beam sim, only BEFORE the beam sim, (6) turning off all NVIDIA Control Panel adjustments and resetting them to defaults, (7) disabling any Windows .icc calibration profiles and going to the display’s native colorspace (and doing picture adjustment in RetroArch BEFORE the beam filter), and (8) changing CRT filter shaders (some create more banding than others) / disabling CRT filters. Some CRT filter shaders violate Talbot Plateau Law with the beam.
Technical/Scientific Rationale:
- (1) and (2) are because on some displays RGB(0,0,0) is the same color as RGB(10,10,10). So you have to compress your gamut (e.g. HDTV 16-235, or subtle things like 2-254) to bypass the banding region.
- (3) Most displays do this beam simulator better in SDR mode, but sometimes it looks better in HDR mode.
- (4) Lowering this number causes less complicated Talbot Plateau stacking (less blending), especially when done in conjunction with (1)+(2).
- (5) Picture adjustments after the beam sim tend to distort the Talbot Plateau perfection and start banding.
- (6) Same reason as item (5).
- (7) Same reason as item (5).
- (8) Some filters distort the Talbot Plateau behavior at the dark tip of rolling scanout where it overlaps the bright tip of the previous refresh cycle. You may be able to fix some of this with the tweaks in items (1)+(2) to lift blacks and lower whites. But not all filters fix themselves this way, so temporarily disabling the filter may also produce a sudden disappearance of banding on some displays, as the banding was traced to certain CRT filters.
Also, try blurry-pixel filters, and see if that produces better results than sharp-pixel filters.
If you still cannot get your banding to disappear despite the above (even with very grey blacks or dimmed whites), it could also be due to 6-bit behavior in a TN panel. They tend to have difficulty with the subtle shades at the phosphor-fade tips.
If your native:simulated Hz ratio is even on an LCD, do not turn this off. It intentionally ROLLS to prevent image retention on an LCD: my CRT shader is capable of non-integer native:emulated Hz ratios, and I piggyback on this trick to slew the emulated Hz relative to the output Hz.
For 60.000fps at 120.000Hz, it’s intentionally emulating a ~59.97Hz CRT, to prevent image retention from static electricity buildup caused by the voltage-polarity inversion algorithm in LCDs. See LCD Image Retention from BFI. The CRT shader automatically avoids this, per the math algorithm in the shader:
// LCD SAVER SYSTEM
// - Prevents image retention from BFI interfering with LCD voltage polarity inversion algorithm
// - When LCD_ANTI_RETENTION is enabled:
// - Automatically prevents FRAMES_PER_HZ from remaining an even integer by conditionally adding a slew float.
// - FRAMES_PER_HZ 2 becomes 2.001, 4 becomes 4.001, and 6 becomes 6.001, etc.
// - Scientific Reason: https://forums.blurbusters.com/viewtopic.php?t=7539 BFI interaction with LCD voltage polarity inversion
// - Known Side effect: You've decoupled the CRT simulators' own VSYNC from the real displays' VSYNC. But magically, there's no tearing artifacts :-)
// - Not needed for OLEDs, safe to turn off, but should be ON by default to be foolproof.
#define LCD_ANTI_RETENTION true
#define LCD_INVERSION_COMPENSATION_SLEW 0.001
To avoid this, get an OLED (and disable this feature, if already exposed via Retroarch menu), or use an odd-number native:emulated Hz ratio. This will automatically turn off the LCD saver mode, as odd ratios are immune.
The slow analog-like slew is usually better than a sudden flicker every 10 seconds (crude framedrop method to switch voltage polarities). If you see bands, then calibrate them out if possible. Then the roll is largely invisible to human eyes. That’s why hunterek does not see it, the bands are invisible, but it’s still rolling (invisibly).
That’s what i was told in the github thread after i posted my monitor model.
I tried most of the solutions already so this is simply my monitor.
Hey guys… I’m finally getting around to checking this feature out. I enabled “shader sub-frames” to get the rolling scanline option… my question is… should I be enabling this universally, or only for certain cores/setups?
You’re not actually using the rolling scan option. Enable the subframes and then use the crt-beam-simulator shader located in shaders_slang/subframe-bfi.
Ah, ok… thanks. I was looking for the shader under the “blob” folder mentioned earlier somewhere, and couldn’t find it. So, once I enable the subframes, should I leave the rolling scanlines option that appears off, or turn it on in tandem with the shader? Thanks for the help.
leave the rolling scan option off. it’s unrelated.