An input lag investigation

Yeah, 60 fps was maintained solidly the whole time. I, too, assumed it could never push the latency outside of a single frame without reducing the framerate, but that’s not what my results indicated. I was using an AMD card with an Eyefinity setup, so maybe there’s something weird at play with that.

Thanks for doing the additional check. I’m honestly relieved that my results weren’t universally applicable, since I like shaders so much :stuck_out_tongue:

1 Like

Another data point regarding shaders and input lag:

I activated the crt-aperture GLSL shader on my Pentium J4205 system running Ubuntu 16.10 in DRM/KMS mode. I’m using the built-in Intel graphics. I measured input lag with my usual test routine and just like my tests of the other shaders in Windows, input lag performance remained unchanged after activating the shader.

3 Likes

Thank you for your investigation on shaders and raw input lag! Do you plan to test latency with the new WASAPI Windows audio driver, too?

His testing setup is only for video latency.

I assume anyone could test audio latency just by putting a microphone and a gamepad next to their speaker and then recording a button press. Take that audio file into audacity or whatever and then look at the timing difference between the button click and the sound effect firing.
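As a rough sketch of that analysis, here’s how you might find the gap between the two sounds programmatically instead of eyeballing it in Audacity. This assumes a mono recording and simple threshold-based onset detection; `onset_times` is just an illustrative helper, not part of any existing tool, and the synthetic signal stands in for a real microphone capture:

```python
import numpy as np

SAMPLE_RATE = 48000  # Hz, typical recording rate

def onset_times(signal, threshold=0.5, refractory_s=0.1):
    """Return times (in seconds) where the signal first crosses the
    threshold, ignoring further crossings for refractory_s afterwards."""
    onsets = []
    refractory = int(refractory_s * SAMPLE_RATE)
    i = 0
    while i < len(signal):
        if abs(signal[i]) >= threshold:
            onsets.append(i / SAMPLE_RATE)
            i += refractory  # skip the rest of this transient
        else:
            i += 1
    return onsets

# Synthetic stand-in for a real recording: a button click at 0.20 s
# and the game's sound effect at 0.35 s.
rec = np.zeros(SAMPLE_RATE)
rec[int(0.20 * SAMPLE_RATE)] = 1.0
rec[int(0.35 * SAMPLE_RATE)] = 0.8

click, effect = onset_times(rec)
latency_ms = (effect - click) * 1000
print(f"audio latency: {latency_ms:.1f} ms")  # → audio latency: 150.0 ms
```

On a real capture you’d load the WAV first and probably band-pass or rectify it, since a mechanical click and a speaker transient look messier than clean impulses.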

1 Like

It seems like Mednafen Saturn has some input lag problems as well.

I have a question for you, Brunnis. I have actually bought myself a J4205 (in an ASRock Beebox) as a dedicated RetroArch box, but I cannot run games reasonably without having threaded video enabled. Is that a limitation of the very small form factor of the Beebox thingy, or am I simply doing something wrong? I am using Ubuntu 16.10 and run RetroArch 1.6.1 in KMS mode; video_max_swapchain_images is set to 2 and frame delay to 8. Are you using threaded video mode yourself, and were your input lag tests run with it enabled? If so, what latency change does threaded mode cause? Thanks for your help!

Unfortunately, I have not tested the difference between having threaded video disabled/enabled. I always have it disabled. I only run NES and SNES emulation currently and without using shaders and using a 1080p display, I can use max_swapchain_images=2 and a frame delay of ~6. Using a higher resolution display, such as a 4K TV, will increase the system load noticeably, although I haven’t quantified it. If you’re using a 4K screen, I would suggest actually running RetroArch in 1920x1080 instead.
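For reference, a sketch of how those settings look in retroarch.cfg (option names as in recent RetroArch builds; the values are just the ones that worked on my setup, so tune them for your own system):

```
# Low-latency settings discussed above (retroarch.cfg)
video_threaded = "false"            # threaded video disabled
video_max_swapchain_images = "2"    # less buffering, less lag
video_frame_delay = "6"             # ms; lower it if you get stutter or audio crackle

# If your display is 4K, render fullscreen at 1080p instead:
video_fullscreen_x = "1920"
video_fullscreen_y = "1080"
```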

There’s one more thing: A while back I updated my Ubuntu 16.10 installation. This updated the Linux kernel (don’t remember the version) and Intel GPU driver. After this, I noticed significantly worse GPU performance and I had to downgrade to restore performance. So, if everything else fails, try a completely fresh Ubuntu 16.10 installation with no updates.

2 Likes

Thanks for your answer! I found out that I simply had to set frame delay down to 4, and then I was able to run without threaded video! :slight_smile:

Frame delay should be the last thing you tweak. Get your other settings the way you want, and then dial in the amount of frame delay that your system can handle. This frame delay amount will vary for every emulator, in my experience.

In previous tests you found that shaders can introduce significant input lag. In your more recent tests you found they produced no input lag. What do you think changed in that time to account for the difference? Thanks :).

This was actually the first time I’ve tested how shaders affect input lag. The previous testing was performed by hunterk, and given the pretty long time that passed between his tests and mine, and the probably significant differences in hardware and software setup, it’s pretty much impossible to say.

Someone will probably have to reproduce this behavior again and then we can start comparing test setups to try to pinpoint the cause.

1 Like

Hi,

Is there a final answer on input lag for GL vs. Vulkan on Windows 10? Is it hardware dependent (ATI vs. Nvidia)?

Thank you

Bubs

Nope, no final answer yet. New tests should be done on both AMD and Nvidia hardware with the latest drivers. Unfortunately, I now have very little time for such tests and I no longer have any (modern/supported) AMD GPUs.

1 Like

Brunnis, for 240p output on the RPi3, do you recommend DispManX and some frame delay to get input lag as low as possible? What about USB polling rate, could that affect things as well?

I haven’t personally used 240p output, but the same setting recommendations should apply, i.e. DispManX for lower input lag. Frame delay will provide an improvement as well, but may not be feasible if you run any kind of demanding workload. Higher USB polling rate will improve the input lag, if you can get it to work reliably (I haven’t tested it myself).
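As a sketch, the Pi-specific bits would look something like this in retroarch.cfg (option names from current RetroArch; the frame delay value is just an example):

```
# Raspberry Pi low-latency video settings (retroarch.cfg)
video_driver = "dispmanx"   # DispManX driver for lower input lag on the Pi
video_frame_delay = "4"     # ms; example value, reduce it if games stutter
```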

Do you know how to change the usb polling? My friend is trying to do it now. Is it something in Retroarch itself?

FYI, this is now controllable with the HW bilinear filtering option (video_smooth). Can be removed in the original post :slight_smile:

I saw it posted somewhere that if you are using hard GPU sync, there’s no reason to touch the frame delay setting. Is that true, or is there still a use for frame delay in conjunction with hard GPU sync?

Thanks! I see hunterk made an edit about that.

Those settings are not related to each other. Frame delay will still work the same, whether hard GPU sync is used or not.
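In other words, you can set both independently. A retroarch.cfg sketch with example values (option names as in recent RetroArch builds):

```
# Hard GPU sync and frame delay are independent knobs (retroarch.cfg)
video_hard_sync = "true"         # GL driver: sync CPU to GPU each frame
video_hard_sync_frames = "0"     # 0 = strictest sync
video_frame_delay = "6"          # ms; works the same with or without hard sync
```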

[quote=“Bahn_Yuki, post:478, topic:4407, full:true”] Do you know how to change the usb polling? My friend is trying to do it now. Is it something in Retroarch itself?[/quote] No, sorry. I haven’t dabbled with that yet.
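For what it’s worth, on Linux the USB polling rate is usually adjusted via the usbhid kernel module rather than inside RetroArch. A sketch, with example values; parameter availability depends on your kernel (mousepoll has been around for a long time, while a separate joystick parameter only exists on newer kernels):

```
# /etc/modprobe.d/usbhid.conf  (values are examples; check your kernel's support)
options usbhid mousepoll=1    # poll USB mice every 1 ms (1000 Hz)
# On newer kernels, gamepads/joysticks have their own parameter:
# options usbhid jspoll=1
```

After a reboot you can verify the active value with `cat /sys/module/usbhid/parameters/mousepoll`.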

@Brunnis: I think I have found a way to cut a frame of input lag with GLES on the Raspberry Pi using the Broadcom graphics stack (dispmanx+GLES). It comes from an idea I had and a talk with popcornmix of Pi kernel fame:

I would like you to test this new idea. I have a RetroArch repository with its implementation here:

I have also uploaded a test binary (for the Pi3) here, in case you don’t want to go through the building process:

So, Brunnis, can you apply your magical LED & camera equipment and test vs. plain dispmanx, etc.? Simply use the GL driver: it should have the same amount of lag as the plain dispmanx driver now. Thanks!

2 Likes