Definitive guide to reducing input lag to the minimum possible

Does such a thing exist? I’ve looked all over and have only been able to piece together various bits of information gathered from different sources.

What I’ve learned so far:

- Use a CRT monitor; they have no lag
- If you use an LCD, pick a low-lag gaming display
- Disable all image processing "enhancements"
- Make sure the display is set to its native resolution
- Use "windowed fullscreen" if available
- Use Windows 7 64-bit instead of Linux
- Use hard GPU sync, set to 0 or 1
- Set vsync frames to 1
- Set "frame delay" to the highest value that doesn't affect performance
- Don't use shaders (except maybe Hyllian)
- Don't use bilinear filter/video smoothing
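For RetroArch specifically, most of the items above map to plain keys in retroarch.cfg. A sketch of what that might look like (key names are from my understanding of current RetroArch builds; double-check against your version's settings menu before relying on them):

```ini
# Low-latency sketch for retroarch.cfg -- verify key names against your build
video_fullscreen = "true"            # exclusive fullscreen
video_windowed_fullscreen = "false"  # borderless/windowed fullscreen can add lag
video_vsync = "true"
video_hard_sync = "true"             # hard GPU sync on
video_hard_sync_frames = "0"         # 0 or 1; 0 is strictest but needs CPU headroom
video_frame_delay = "0"              # raise per game until audio crackles, then back off
video_smooth = "false"               # bilinear filtering off
```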

Is this it? Is there a definitive user’s guide for all this somewhere?

Learned a lot from this blog post: http://filthypants.blogspot.com/2015/06/latency-testing.html

Interesting stuff.

There’s no definitive guide that I know of, and most of the existing data is based on subjectivity and voodoo.

  • Windowed fullscreen is actually worse than exclusive fullscreen (as demonstrated in my post there).
  • Shaders aren’t necessarily out, but they can affect things negatively. It’s probably best to try out a given implementation with the 240p test suite’s manual lag test to get some good before/after comparisons.
  • Win7 has better latency than linux in testing from GroovyMAME’s forums but I haven’t had an opportunity to independently verify that with RetroArch. What I can say is that Linux via KMS “feels” much better than Linux via X.
  • Frame delay and hard gpu sync 0 do pretty much the same thing and there doesn’t seem to be any point in stacking them.
  • I don’t think bilinear filtering has any effect, as GPUs have dedicated hardware that handles it for “free.”

Desktop compositing (e.g., Aero in Win7) is very bad. Any kind of stuff in the GPU control panels is also bad and everything should be OFF/Disabled.

Thanks for the info. The delays I'm getting are so low I can't tell myself, but something felt different for a while compared to the original games. Maybe I was just used to the CRT hooked to my console, since I have an LCD monitor :P. But it's so minimal it doesn't matter in the slightest (unless you have to hit frame-perfect timings in something). The audio, however, is delayed more for me than the original, but it doesn't matter much unless I'm trying to play a rhythm game. This audio delay threw me off at first, making me think the input was just as delayed.

[QUOTE=hunterk;26430]There’s no definitive guide that I know of, and most of the existing data is based on subjectivity and voodoo.

  • Windowed fullscreen is actually worse than exclusive fullscreen (as demonstrated in my post there).
  • Shaders aren’t necessarily out, but they can affect things negatively. It’s probably best to try out a given implementation with the 240p test suite’s manual lag test to get some good before/after comparisons.
  • Win7 has better latency than linux in testing from GroovyMAME’s forums but I haven’t had an opportunity to independently verify that with RetroArch. What I can say is that Linux via KMS “feels” much better than Linux via X.
  • Frame delay and hard gpu sync 0 do pretty much the same thing and there doesn’t seem to be any point in stacking them.
  • I don’t think bilinear filtering has any effect, as GPUs have dedicated hardware that handles it for “free.”

Desktop compositing (e.g., Aero in Win7) is very bad. Any kind of stuff in the GPU control panels is also bad and everything should be OFF/Disabled.[/QUOTE]

Thanks for the info, good stuff. Regarding frame delay and hard GPU sync, would it make sense for lower-end systems to enable frame delay and disable hard GPU sync?

Yeah, if your CPU can't handle hard GPU sync, frame delay is a good alternative; it just has to be tuned for each game rather than being automatic.

Is there any advantage to frame delay over using hard GPU sync = 1?

How does one go about determining the right per-game setting for frame delay?

I'm also curious to see more of the CRT shaders tested for latency. One observation: CRT-Hyllian is the only CRT shader besides CRT-Caligari that will run on the Raspberry Pi 2. I suspect the reason CRT-Hyllian doesn't cause input lag is that it does less heavy math.

Based on your blog post, I snagged a refurbished Windows 7 64-bit workstation, loaded up RetroArch, and configured the settings accordingly. I could immediately tell the difference in input lag vs. using Linux on the Raspberry Pi 2. Do you know what the actual measured difference is, or what the minimum input lag is when using the Linux kernel?

Thanks!

Nope, I haven’t gotten to do any testing in linux at all. Subjectively, I think linux+KMS feels as good or better than my experience with Windows, but the GroovyMAME guys insist this isn’t the case, so it could just be placebo on my part. Running in linux+X/desktop is indeed far from optimal, though you can use hard gpu sync or frame delay there, just as in Windows. GroovyMAME guys say they’ve never gotten below 4 frames of latency with linux+kms but that they can get next-frame latency in Win64. I haven’t had the opportunity to reproduce their settings and results but they seem pretty confident about it.

I think in the case of crt-hyllian and xbr, it was just functioning similar to frame delay on my system (that’s the only plausible explanation I can come up with for why it would be lower latency than no shader at all), so, depending on how powerful your GPU is, YMMV. Have you tried crt-easymode on RPi2? I think it will probably run there, too.

To determine the right frame delay, just keep turning it up until you get crackles/stutters and then dial it back one step (kinda like how you overclock a PC, if you have any experience with that). I’m not sure how hard gpu sync of 1 frame compares with max frame delay.
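Since the right value differs per game, the tuned result can be stored with RetroArch's override system so each game keeps its own frame delay. A sketch (the path convention is my understanding of how overrides are laid out; the file is normally created for you via Quick Menu > Overrides rather than by hand):

```ini
# config/<core name>/<game name>.cfg -- a game-specific override (illustrative path)
video_frame_delay = "8"   # highest value this game tolerates without audio crackling
```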

[QUOTE=hunterk;27105]Nope, I haven’t gotten to do any testing in linux at all. Subjectively, I think linux+KMS feels as good or better than my experience with Windows, but the GroovyMAME guys insist this isn’t the case, so it could just be placebo on my part. Running in linux+X/desktop is indeed far from optimal, though you can use hard gpu sync or frame delay there, just as in Windows. GroovyMAME guys say they’ve never gotten below 4 frames of latency with linux+kms but that they can get next-frame latency in Win64. I haven’t had the opportunity to reproduce their settings and results but they seem pretty confident about it.

I think in the case of crt-hyllian and xbr, it was just functioning similar to frame delay on my system (that’s the only plausible explanation I can come up with for why it would be lower latency than no shader at all), so, depending on how powerful your GPU is, YMMV. Have you tried crt-easymode on RPi2? I think it will probably run there, too.

To determine the right frame delay, just keep turning it up until you get crackles/stutters and then dial it back one step (kinda like how you overclock a PC, if you have any experience with that). I’m not sure how hard gpu sync of 1 frame compares with max frame delay.[/QUOTE]

Thanks for the reply-

My own subjective experience using multiple Windows 7 64-bit PCs (connected to the same display) leads me to believe that Windows 7 CAN, given the right hardware, result in less input lag than Linux. However, I can't really confirm this because I've only used one Linux machine, and a very underpowered one at that (the Raspberry Pi 2). I don't think Windows 7 64-bit will automatically result in less input lag than Linux by itself; I think a lot of it depends on hardware.

Crt-easymode is just slightly too much for the Pi: it will run, but it causes a slight drop in framerate, and (like all shaders on the Pi) only at 720p. It runs smoothly if you disable the Lanczos filter, but then it doesn't look nearly as good.

This really helped me out. Thanks!

I believe I read some input lag tests somewhere showing that GroovyMAME did have less lag than KMS mode. I don't know how legitimate the tests were, of course, or whether the difference is noticeable.

Damn, I got the first BSOD I’ve ever seen on this PC while editing this post o.o

The best result for me is hard GPU sync = 0 plus frame delay set as high as possible. I get the biggest input lag reduction by combining these two options. Hard GPU sync alone is good, but with frame delay it's better. For a long time I played with only "hard gpu sync" because I didn't know about "frame delay", so I could clearly see the difference when I added frame delay. I'm extremely sensitive to input lag; I can feel a 2-frame difference because I play fighting games and shmups: https://www.youtube.com/watch?v=nK20Gr6mUPc

I have Mushihimesama in both the Steam version and MAME, and with these two settings combined it's hard to tell the difference between the two versions (just about 1 frame, because frame delay up to 10 is too demanding with this game emulated) on the same TV screen (Sony KDL-42W705B, TV input lag: 13.5 ms).

With frame delay at 10 it's the same input lag as the Steam version … but you'd need a 6 GHz Intel, haha.


But hard GPU sync has to be set to "1", not "0", to be enabled, right?

To activate hard GPU sync, just turn the "hard gpu sync" setting on. "Hard gpu sync frames" is the precision setting; 0 gives the best result but demands CPU power.
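In retroarch.cfg terms, the on/off toggle and the frame count are two separate keys (names as I understand them from current RetroArch builds; verify against your version):

```ini
video_hard_sync = "true"        # the on/off switch for hard GPU sync
video_hard_sync_frames = "0"    # how many frames the CPU may run ahead; 0 = tightest
```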


Thanks, I will try that !