An input lag investigation

This is on Linux, not Windows.

It’s vsync OFF. And yes, it’s 60FPS. I can easily tell when something is 30FPS. Every frame is there.

And no, Vulkan for me (even on Windows where I dual boot to play modern games) is excellent. I’m on a 980 Ti with a native g-sync monitor. Never had any issues with Vulkan.

I suspect this is an nvidia vs amd thing. All the advice on blurbusters about g-sync is spot on for me. Vsync does exactly what the articles claim it does. The takeaway here is that those articles are called “G-SYNC 101”, not “Freesync 101” :wink:

Not sure if trolling. The data and tests provided in those articles are extensive. Every combination of vsync on/off with various frame caps was tested. Thousands of individual tests were performed, and it took months to complete them all. There are 15 articles, if you haven’t noticed.

Indeed, the game runs at 30fps for me on win10 with nvidia, vulkan, and RA vsync off. With glcore it seems to run fine at 60 without tearing.

Before, I had tearing and stutters like RealNC is seeing, but that was with older drivers on win7.

So ok, everything is possible, no need to be aggressive.


I did a quick test again as I’m on win10 now, new Ryzen PC.
240fps recording (1 frame = 4.167ms), bottom of the monitor (lg32gk850g).

MAME (1 frame reaction, unibios boot settings):

vulkan gsync: 5 5 6 5 6 7 5 4 7 8

avg 5.8 camera frames ≈ 24 ms

FCEUMM (1 frame reaction, runahead 1 in smb):

gsync vulkan: 7 7 5 4 5 8 5 5 7 4

avg 5.7 camera frames ≈ 23.75 ms

gsync vulkan no shader: 6 3 8 5 6 5 7 4 5 6

avg 5.5 camera frames ≈ 23 ms

So MAME is good since the fix Energy-t made.
I’m a bit faster than in my older tests on win7: same range, but seemingly more lucky low values.
My new computer is acting the same on that gsync monitor + nvidia GPU, good to know.
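
For anyone who wants to redo this conversion on their own recordings, here is a minimal sketch of the same math in Python. Nothing RetroArch-specific, it just turns 240 FPS camera frame counts into milliseconds; the sample lists are the counts from above.

```python
# Convert reaction times counted in 240 FPS camera frames to milliseconds
# (1 camera frame = 1000/240 ms, about 4.167 ms). Lists are the counts above.
CAMERA_FPS = 240
FRAME_MS = 1000 / CAMERA_FPS

def summarize(label, frames):
    avg = sum(frames) / len(frames)
    print(f"{label}: avg {avg:.1f} camera frames = {avg * FRAME_MS:.1f} ms")

summarize("MAME vulkan gsync",      [5, 5, 6, 5, 6, 7, 5, 4, 7, 8])
summarize("FCEUMM gsync vulkan",    [7, 7, 5, 4, 5, 8, 5, 5, 7, 4])
summarize("FCEUMM gsync no shader", [6, 3, 8, 5, 6, 5, 7, 4, 5, 6])
```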


I’m on Windows 10 with a GTX 960 and I get the same behaviour: with vulkan, if I disable v-sync the framerate drops to 30 (even though it shows 60).

Also, I do not have a VRR monitor and I’m not sure what the actual pacing is, but: if I use vulkan + vsync + sync to exact frame rate, RA shows the correct FPS and there is no tearing; if I do the same with d3d11 or glcore, it does tear…

Stop saying that! Seriously. I know how to tell framerates apart. I’ve been doing it for quite a while now. Here’s a video:

http://83.212.109.87/~realnc/vid/retroarch_vulkan_vsync_off.mp4

The animation during the room transition looks perfect in the video. However, it doesn’t look like that in real-time: there is jitter in the animation that is not visible in the video at all.

But this video proves that I use vulkan on Linux with vsync OFF (in the RA menu you can see the FPS going above the refresh rate, which is 144Hz in my case) and there is no 30FPS cap or anything.

What does VRR have to do with frame pacing? VRR does not fix frame pacing issues. It has absolutely nothing at all whatsoever to do with frame pacing. VRR will just update the display (by ending the vblank) when the game presents a new frame. If a game has frame pacing issues, VRR is not going to help you.
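
To make the distinction concrete: frame pacing is about when each frame gets presented, not how many frames per second you average. Here is a minimal sketch (plain Python, with made-up timestamps rather than anything read from a real driver) of how pacing jitter can be quantified; a perfectly paced 60FPS stream would have every delta at ~16.7 ms:

```python
# Hypothetical frame present timestamps in ms. The average is still ~60 FPS,
# but the gaps between frames are uneven -- that unevenness is what you see
# as jitter. VRR does not smooth it out; it just scans out whenever a frame
# happens to arrive.
present_times_ms = [0.0, 16.7, 33.3, 55.0, 61.7, 83.4, 100.0]

deltas = [b - a for a, b in zip(present_times_ms, present_times_ms[1:])]
avg_ms = sum(deltas) / len(deltas)
spread_ms = max(deltas) - min(deltas)

print(f"average frame time: {avg_ms:.1f} ms")    # ~16.7 ms, i.e. "60 FPS"
print(f"frame time spread:  {spread_ms:.1f} ms") # large spread = bad pacing
```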

And in some rare cases, gsync ON with vsync OFF gives me temporary tearing near the bottom of the screen. This is with a 120FPS cap on 144Hz. One game that does this for me is Prey 2016. Another one is the inventory and map screens in Witcher 3. When closing the inventory or map screen, during the transition from the menu to the main game, there is a split second where tearing is visible after the game engine stalls the framerate for about half a second. When LFC tries to recover from this 0FPS situation, you can see a few frames that have tearing near the bottom of the screen.

So please come down from that high horse of yours and give some consideration to other people’s input and experience.


tl;dr: GPU drivers are unpredictable. YMMV

Not on Linux. If I force vsync off, FPS goes above refresh rate, as shown in the video.

I don’t know how to demonstrate this. I recorded a video, and the jitter (not stutter, just jitter, like when viewing a 24FPS movie on a 60Hz display) is not visible in the video. I don’t have a 240FPS phone camera. I have a Nokia 5.3, and it can only record 30FPS.

So you’ll have to take my word on this. I am not lying.

Yeah. There is no other info available, so I just referred to the documented behavior. If that has changed now, great; it wasn’t mentioned anywhere. All I can say is that I found one case where turning vsync OFF results in jitter. That’s all. Take it for what it is.

Yes. And what “multiple settings”? The only setting that needs changing is vsync. I toggle it on and off.

Right. I guess all the people out there who have mentioned this occurring on their machines are imaginary:

https://forums.blurbusters.com/viewtopic.php?t=7128

https://www.nvidia.com/en-us/geforce/forums/game-ready-drivers/13/206258/gsync-on-vsync-off-screen-tearing/

https://www.nvidia.com/en-us/geforce/forums/game-ready-drivers/13/252015/understanding-how-g-sync-and-v-sync-work-together/1840514/

And there are plenty of Google hits for “g-sync vsync off tearing.” It’s a known thing and has been known forever. I’ve seen it happen in the past, and it still happens today. I still see people to this day on the blur busters forum (I’m part of the admin team there and thus see a lot of posts) reporting that without vsync, gsync can still tear. And it’s always the same report: “temporary tearing near the bottom of the screen.”

I don’t have any issues with frame pacing or stutters right now with RetroArch, but I will believe anyone who does.

Uneven frame pacing and microstuttering have always bothered me in the past, in many different situations. They’re among those issues that are sometimes very hard to fix, very hard to pin on either your system or the affected program (so you know whether it’s even worth trying to fix), and, most importantly, very hard to describe to others or to prove.

So please, don’t be condescending to people trying to explain such issues. Some are more sensitive than others to these things. I know most people don’t even notice microstutters (or care about them), but as someone who has stopped playing some games altogether just because they repeated 1 frame every few seconds, I can sympathize and hope his issue gets fixed.


I have a G-sync monitor and I tried turning V-sync off after reading this thread; it becomes a jittering mess. It may not be the case for anyone else, I’m just sharing it.

Sorry, Vulkan. No, I don’t have any footage; I just read this thread and did some tests, noticed jitter right after turning V-sync off, and then turned V-sync back on again.

I see, does this happen only with RetroArch? (the FPS cap)

I’ve recently put together a low-latency NES/SNES setup based on a Raspberry Pi 4. I wanted to see what I could achieve with this hardware together with Lakka. I am now running Lakka 3.6 and have finished the setup and run some tests. The setup is:

  • Raspberry Pi 4 4GB (CPU overclocked to 1.9 GHz, otherwise stock)
  • Flirc case
  • Lakka 3.6
  • Raphnet USB to Wii controller adapter + Nintendo SNES Classic controller (with 1 ms USB polling and 1 ms adapter polling)

The system/Lakka settings are:

  • /boot/config.txt: arm_freq=1900
  • Forced global 1000 Hz polling for USB game controllers via kernel command line (/boot/cmdline.txt): usbhid.jspoll=1
  • Set emulator Nestopia for NES
  • Set emulator snes9x2010 for SNES
  • RetroArch: Audio driver = alsa
  • RetroArch: Threaded video = off
  • RetroArch: Max swapchain images = 2
  • RetroArch: Frame delay = 2
  • RetroArch: Run-ahead 1 frame (and use second instance enabled)
  • RetroArch: Enabled zfast-crt shader
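
Roughly, the corresponding config entries look like this. The /boot lines are exactly the ones listed above; the retroarch.cfg option names are the ones I believe RetroArch uses for these settings, but they can vary between versions, so treat this as a sketch and verify against your own retroarch.cfg rather than copy-pasting blindly:

```
# /boot/config.txt -- CPU overclock
arm_freq=1900

# /boot/cmdline.txt -- append to the existing single line;
# forces 1000 Hz (1 ms) polling for USB game controllers
usbhid.jspoll=1

# retroarch.cfg -- latency-related settings (names may vary by version)
audio_driver = "alsa"
video_threaded = "false"
video_max_swapchain_images = "2"
video_frame_delay = "2"
run_ahead_enabled = "true"
run_ahead_frames = "1"
run_ahead_secondary_instance = "true"
```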

With these settings, I ran input lag tests on Super Mario World and Mega Man 2. I used my iPhone 12 and recorded the screen and me pressing the jump button at 240 FPS. I then used the app “Is It Snappy?” to carefully analyze the result, with 20 samples for each of the two tested games. Before testing, I calculated what the input lag (in 60 FPS frames) would have been on an original console with a CRT, taking into account the character’s placement on the screen. The expected input lag on a real console+CRT:

Super Mario World:

  • Avg: 3.2 (i.e. ~53 ms)
  • Min: 2.7
  • Max: 3.7

Mega Man 2:

  • Avg: 2.1 (i.e. ~35 ms)
  • Min: 1.6
  • Max: 2.6

For this testing, I used my trusty old 22 inch Samsung LCD TV. From previous tests and comparisons I’ve made with other displays, I’ve determined that this display’s input lag when running at 1080p native resolution is approximately 1.05 frames.

The results of my Lakka tests, after subtracting the known input lag of the display (1.05 frames), are:

Super Mario World:

  • Avg: 3.2
  • Min: 2.7
  • Max: 3.7

Mega Man 2:

  • Avg: 2.1
  • Min: 1.7
  • Max: 2.7

As you can see, these results are within measurement tolerance of the “real deal”. This setup, with a lowly Pi 4, performs like the original console and even manages to have the zfast-crt shader active. The one compromise I had to make was to use snes9x2010 instead of snes9x, but I believe snes9x2010 is actually quite a fine core. I can’t say I’ve tried that many games, but two of the heaviest SNES games, SMW2 and Star Fox, seem to work fine, with no stuttering, audio issues, etc.

Using this with my LG OLED65CX, the average input lag will be approximately 5 ms (0.3 frames) worse than the original NES/SNES running on a CRT. That’s quite okay, right? :grin:

Big thanks to the RetroArch and Lakka developers for making this possible. I bow in respect. :pray:


Wow, great results, and solid methodology as always. Thanks for reporting!


If I compare to my chain of lag on my PC, just using gl hard sync 0 or vulkan, I’d say it could be faster by 10ms (even 20ms, but that’s with VRR or frame delay ~10).

What I mean is simply that I estimated what the input lag from button to pixel would be on the original consoles with a CRT. For example, for Super Mario World:

  • Average time from button press until game sampling the input: 0.5 frames
  • “Internal lag”, i.e. number of frames where the game does not respond to input: 2
  • Scan-out to display (top to bottom, 16.67 ms in total) until reaching the character’s position: 0.7

So, total button to pixel lag on a real SNES with CRT, given the exact same game scene would be ~3.2 frames or ~53 ms.
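
As a sanity check, here is the same sum in code form (it just restates the estimates above, nothing here is measured):

```python
# Estimated button-to-pixel lag for Super Mario World on a real SNES + CRT,
# using the component estimates above (all in 60 FPS frames).
FRAME_MS = 1000 / 60          # ~16.67 ms per display frame

input_sampling = 0.5          # average wait until the game samples the input
internal_lag   = 2.0          # frames before the game visibly reacts
scanout        = 0.7          # scan-out from top of screen to the character

total = input_sampling + internal_lag + scanout
print(f"{total:.1f} frames ≈ {total * FRAME_MS:.0f} ms")  # 3.2 frames ≈ 53 ms
```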

Well, it only works that way if you’re at parity with the original console before applying run-ahead. One problem with modern frame buffered hardware, at least if you don’t have a VRR display, is that the frame is rendered first and then scanned out to the display. This gives an inherent 1 frame input lag deficit.

In my case, with the Raspberry Pi 4, I start by getting rid of excessive buffering and threading related latency (max swapchain images = 2 and threaded video off). I minimize the USB polling latency by maximizing the polling frequency. I then further reduce the input lag by 2 ms using frame delay. This compensates roughly for the remaining 1-2 ms input lag caused by my controller + adapter. At this point, the system performs 1 frame slower than “the real deal”. I then add 1 frame of run-ahead to reach my final goal of parity with real hardware.
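
Written out as a rough budget (same approximations as in the paragraph above, nothing newly measured):

```python
# Approximate lag deficit of the Pi 4 setup relative to real console + CRT,
# following the reasoning above. Values in milliseconds.
FRAME_MS = 1000 / 60

deficit_ms  = FRAME_MS   # frame fully rendered before scan-out (fixed-refresh display)
deficit_ms += 2.0        # controller + adapter polling, roughly 1-2 ms
deficit_ms -= 2.0        # frame delay = 2 ms claws that back
deficit_ms -= FRAME_MS   # run-ahead of 1 frame removes the remaining frame

print(f"net deficit vs. real hardware: {deficit_ms:.1f} ms")  # ~0 ms, i.e. parity
```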

Yep, when I talk frames it’s native display frames, so ~60.09 on SNES and ~60.0 on the Raspberry. I’ve clarified slightly by editing my previous post.

Yes, 53 ms including the Samsung LCD TV. If you’re at 14 ms, you’re significantly faster than the real console ever was. That’s certainly possible nowadays when using run-ahead, but I personally aim to stay as close to the original consoles as possible.

It’s worth mentioning that where and what you test may influence the results. For example, Mega Man 2 responds on the next frame when you’re in the menus, while it responds on the second frame when you’re in game. My tests were both in-game, jumping with the characters.

No issue here, so this is definitely an issue with your OS, drivers, or setup. Maybe you should drop the smug act.

I have the same issue but with an nvidia card (I think @burner is on amd); I’ve read about other people having the same behaviour. I will also say that for me it’s not a “real” problem because I usually keep vsync on…

Yes, that’s not the first time I’ve heard about people having those 30fps issues; however, it’s obviously an OS/drivers/setup problem since some other people aren’t affected.