An input lag investigation


@Brunnis would you mind editing your last post in order to clarify a few things about the data?

  1. What game & emulator core are those frame numbers for (Yoshi’s Island and Snes9x2010, I presume)?
  2. Does VC4 refer to the (still) experimental Mesa OpenGL driver or the kernel DRM driver?

  1. Those numbers are just an example of the relative differences, but, yes, they are typical values that I saw when testing Yoshi’s Island with Snes9x2010.
  2. The DRM driver.



Yes, “Dispmanx + GLES” = “GLES using the old BCM driver” (sadly still the default on Raspbian… I HATE that fact).

With the latest RetroArch Git, I have forced the driver to wait for vsync after issuing a flip. I believe in your test results, of course, but as things are now, there should NOT be any more input lag frames with “Dispmanx + GLES” than with “plain Dispmanx”. All with max_swapchain=2, of course.


Are there any new recommendations for minimal lag on a Pi2 or Pi3 with RetroPie? Or are these still valid:

  • video_driver = “dispmanx”
  • video_vsync = true
  • video_threaded = false
  • video_frame_delay = 0


Those are still perfectly valid, but if you use

  • video_driver = “gl”
  • video_max_swapchain_images = 2

instead of

  • video_driver = “dispmanx”

you should get the same amount of lag, and you can now use GL shaders.
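Put together, the low-lag GL setup described above would look like this in retroarch.cfg (a sketch combining the options mentioned in this thread; double-check the names against your RetroArch version):

```
video_driver = "gl"
video_vsync = true
video_threaded = false
video_frame_delay = 0
# 2 images = no extra buffered frame between the core and the display
video_max_swapchain_images = 2
```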


Hi everyone! The “Super NT” announcement made me think about hassling you again with this dreaded input lag subject :wink: Sorry about that :slight_smile:

I created a new bounty suggestion for a Lakka build with Nvidia proprietary drivers + a proper use of the Vulkan X-free context (or Nvidia KMS).

Did anyone here measure the input lag difference between some common Linux setups like:

  • X + OpenGL
  • X + Vulkan
  • KMS
  • Vulkan X-free context

My experiments: under both Arch and Ubuntu, with a current Nvidia GPU, Vulkan “almost works”. It does work under X, but I’m not sure whether I’m supposed to benefit from reduced lag under X (is double buffering applied?). The Vulkan X-free context ALMOST works: I can start RetroArch from the console, and if I specify a game on the command line, the game works. If I don’t, the menu works, but I get an error if I try to start a game from there (resource not free, or something). The Nvidia KMS mode segfaults.

Well, all that is promising :slight_smile: Sorry for annoying you guys!


Use AntiMicro: go to the options and change the polling rate to 1 ms.

You can test it like this: map a controller button to a keyboard key, then press the button repeatedly on the controller and watch the result. With a USB PC gamepad, the minimum is 4 ms between presses, for example 44 ms, 48 ms.
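The check described above (watching the gaps between repeated presses) boils down to taking the smallest delta between event timestamps. A quick Python sketch of that arithmetic; the function name is mine, and how you capture the timestamps (e.g. with python-evdev) is not shown:

```python
def min_press_gap_ms(timestamps_ms):
    """Given timestamps (in ms) of repeated button-press events,
    return the smallest gap observed. The device's polling interval
    can't be larger than this value."""
    return min(b - a for a, b in zip(timestamps_ms, timestamps_ms[1:]))

# Example from the post above: events at 44 ms and 48 ms are 4 ms apart.
print(min_press_gap_ms([0, 44, 48, 60]))  # -> 4
```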

If you are under Windows 7, you can also disable desktop composition. You can test with the Human Benchmark reaction test. It works very well for a lot of games, or MAME with DirectX; RetroArch in OpenGL, on the other hand, is not affected by this.


Wasabi: hi! Thanks for the reply! So does that mean emulating the keyboard can turn out to be faster than using the native evdev gamepad support?


After several tries I realized I had to release the button very quickly. I can now get a 1 ms value. Does that mean the default poll rate on Linux is always 10 ms for any app/hardware combination using evdev? Will my setting apply to any app using the gamepad, even when not emulating the keyboard?

Thanks !

Edit: well, I can get a 1 ms value even with a 10 ms poll rate, so I really wonder if it makes a difference :slight_smile:


I don’t know anything about Linux; maybe Linux queries every 1 ms. Under Windows I can’t go below 4 ms for my gamepad in PS4 or PC mode (I use a fight stick), and in PS3 mode I can’t go below 8 ms (there’s a selector switch that lets you choose the mode).

You don’t need to emulate the keyboard; the keyboard emulation is just to check that everything works. But yes, the keyboard (and mouse) can be polled faster, and mouse manufacturers’ software lets you change the polling rate. The answer is that it really depends on the hardware used.
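On the Linux side, reasonably recent kernels let you force the USB HID polling interval through module parameters; whether jspoll exists depends on your kernel build (check modinfo usbhid first), so treat this as an assumption. A sketch:

```
# /etc/modprobe.d/usbhid.conf
# Ask the usbhid driver to poll joysticks every 1 ms.
# jspoll needs a newer kernel; mousepoll and kbpoll are older equivalents.
options usbhid jspoll=1
```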

The application only works if the gamepad is detected, otherwise it doesn’t work. It must be turned in the background.


Is the audio latency setting basically ignored if you have vsync on? If you set it to something ridiculous like 3200 and have vsync off but audio sync on, the audio will be very far behind, but as soon as you turn vsync on, the audio and video will be in sync. So it seems like if vsync is on, it automatically lowers your audio latency to sync up better.

It will still cause popping if you turn your audio latency too low, so it seems like it might be best to leave audio latency at 64 and not bother lowering it IF you always have vsync on, since it’ll lower it for you if it needs to. I could be completely wrong about all of this though, hah.


no, at least i don’t think so. take this with a grain of salt, signal processing isn’t my thing so i’m not qualified to speak on the subject, but to my knowledge retroarch is quite sophisticated here: if you have both video and audio sync enabled, it will dynamically adjust the audio sampling rate in an attempt to keep the buffer from either under- or overrunning. how well this works is subject to some constraints, such as a few configurable variables like audio_rate_control_delta, audio_max_timing_skew, and (i believe) audio_latency
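The dynamic rate control idea can be sketched in a couple of lines. This is an illustration of the concept, not RetroArch's actual code; the function name and the exact formula shape are mine, assuming the resampler's assumed input rate is nudged by at most a small delta (in the spirit of audio_rate_control_delta) depending on buffer fill:

```python
def rate_controlled_input_rate(base_rate, buffer_fill, delta=0.005):
    """Nudge the resampler's assumed input rate based on buffer fill.

    buffer_fill is the fraction of the audio buffer in use (0..1).
    At 0.5 the rate is untouched. A fuller buffer raises the assumed
    input rate, so fewer output samples are produced per input sample
    and the buffer drains; an emptier buffer does the opposite. With
    delta = 0.005 the pitch never shifts by more than 0.5%, which is
    inaudible in practice.
    """
    return base_rate * (1.0 + delta * (2.0 * buffer_fill - 1.0))

# SNES cores output at roughly 32040 Hz:
print(rate_controlled_input_rate(32040, 0.5))  # balanced buffer -> 32040.0
```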

personally i care far more about smooth video than audio so i just leave audio_sync disabled, sorry Themaister (i believe this part of the project was/is his baby!)


That all makes sense. Vsync adjusts the audio’s latency even when audio sync is off, which kinda confuses me tho.


I thought using GL would add one frame of input lag compared to the dispmanx driver? And video_max_swapchain_images = 2 will limit the framerate a bit.

I am still using the default GPU driver. Or should I also switch to the experimental GPU driver?


vanfanel is saying that with the current version of retroarch in git this should no longer be the case, provided video_max_swapchain_images is also set to 2

vanfanel has stated a few posts above that the performance should be about the same as dispmanx currently is. plus with GL you’ll also get access to shaders and i believe it’s going to be the optimal path moving forward once the mesa vc4 driver gets mainlined

again i’ll defer to vanfanel, but probably not. once it gets mainlined into a popular distro like raspbian, it will start getting tested much more aggressively than it is now. if you don’t have experience troubleshooting problems in linux (which usually involves checking logs, editing config files, and being comfortable navigating in and using basic shell commands), i’d guess it’s probably best to wait.

last i heard the reason it hasn’t been mainlined yet is due to the proprietary driver having more mature display hardware support


I tested the GL option now with video_max_swapchain_images = 2, but it feels (no possibility to measure at the moment) like SMW is lagging a bit more.

Edit: I made a mistake: I tested dispmanx without activating video_max_swapchain_images = 2. I have to test GL against dispmanx without using video_max_swapchain_images.


I tried this today with X3 and I feel like there is a bit more responsiveness and I can hit my dashes more consistently compared to higan. 8 seems to be the best “safe” value I can use for fast cores on my 2500k without getting stutters/crackles.


Ya, 8 is what I use too for snes/nes/genesis. I doubt anyone but speedrunners would notice much of a difference, but more responsiveness is always better.
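For anyone trying this, frame delay is set in milliseconds in retroarch.cfg; it makes the core run that much later within the ~16.7 ms frame, so input is sampled closer to display time, at the cost of CPU headroom:

```
# Run the core 8 ms into each ~16.7 ms frame so input is sampled later.
# Too high a value causes stutter/crackling if emulation can't finish in time.
video_frame_delay = 8
```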


Thanks for going through all this testing and putting up your settings. I recently acquired a few-year-old laptop with an integrated Intel 4000 and a Radeon 7670M, and I’m running RetroArch on Windows 10, outputting to a PVM.

Unfortunately, OpenGL in Windows 10 is useless for my setup, but I’ve switched over to d3d with very good results in exclusive fullscreen mode. OpenGL performance is really bad in exclusive fullscreen and won’t sync properly in a borderless window on the PVM; I just get a dropped frame every 5 seconds or so, as the forced desktop composition must not match what OpenGL is doing in RetroArch.

In d3d mode I have smooth scrolling and very low input lag. I’m guessing the lag is about 1 frame; I’m using the manual lag test from the 240p test suite on the Genesis core. I’ve actually been able to get a score as low as 0.1 frames of lag on that test, but I think it must be at least 1 frame, unless the frame delay setting can actually get you to less than 1 frame?

Best test results

Anyways, I know d3d isn’t ideal due to the lack of graphical options, but I wanted to suggest it for anyone outputting to a 15 kHz CRT in Windows. I couldn’t be happier in terms of input lag.
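In case anyone wants to reproduce this, a minimal retroarch.cfg sketch of the d3d exclusive-fullscreen path; the option names are from the desktop RetroArch config, so verify them against your version:

```
video_driver = "d3d"
video_fullscreen = true
# false = true exclusive fullscreen instead of a borderless window
video_windowed_fullscreen = false
video_vsync = true
```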


I’m almost where you are. I’m using a very old laptop with a core2duo [email protected] and an Nvidia 8400 series GPU. These are my results with the latest higan core:



PS: I forgot to say that I’m using a DualShock 3 controller through a BT adapter.

It’s on Arch Linux in KMS/DRM mode.