Let's talk about Latency (and Game Mode)

@Hari-82 @GemaH I might be mistaken, but I believe it adds a whole frame of input lag, yes.

Not on my setup, though. I still get the same number of lag frames judging from the pause/frame-advance method.

@GemaH That method only counts the frames of input lag internal to the game engine, that is, the lag a game has internally that was observable even on the original console. For example, Super Mario Bros has one frame of input lag even on the NES.

It does not measure any of the input lag from the RetroArch graphical pipeline or graphics drivers.
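For what it's worth, the "internal" lag being described can be pictured with a toy loop. This is a sketch of one hypothetical engine design, not actual emulator or game code:

```python
# Toy sketch (not RetroArch code) of "internal" input lag: an engine that
# samples input on one frame but only applies it to the game state on the
# next will always show one frame of lag under pause/frame-advance, just
# like Super Mario Bros does on a real NES.

def run_frames(presses):
    """presses[i] is True if the button is held during frame i.
    Returns, per frame, whether the game state reacted to the button."""
    buffered = False                 # input sampled last frame
    reactions = []
    for held in presses:
        reactions.append(buffered)   # state reacts to LAST frame's input
        buffered = held              # sample this frame's input for next frame
    return reactions

# Press the button on frame 2; the reaction shows on frame 3:
print(run_frames([False, False, True, True]))
# [False, False, False, True]
```

Hard GPU Sync, swapchain depth, and so on all act after this loop has produced its frame, which is why frame advance cannot see them.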

Is that true, though? Because some people are getting different results with this method. For instance, I only get 1 frame of lag in Genesis Sonic games, but other people have reported 2 frames using the same core.

Also, another reason I don’t think that’s true is that the Saturn core has a lot of frames of lag in all games: there’s a base of at least 3 frames of lag in every game, and it goes up from there. I don’t think that’s true of the real console.

I would test, but I can’t right now.

But you can do a simple test by setting Hard GPU Sync Frames to a big value, like 5 frames, and then measure the number of frames of lag in, say, Super Mario Bros. That will give us the answer for sure.

From what I understand, the frame-advance method (with the game paused) can determine the “internal” delay and is useful for setting run-ahead (or preemptive frames) to a proper value, but for the other settings it’s always better to do a button-push-to-animation test.

My mobile (and my old camera) can only record 60 fps; if someone has the means (and time), it would be nice to have a proper test recording at 120 fps or more…

or maybe a dev could give us a more technical explanation :nerd_face:
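As a rough guide to why a higher camera frame rate matters: the camera’s own frame period is the measurement uncertainty. A small illustrative sketch (not a measurement):

```python
# The smallest time unit a camera can resolve is one of its own frames,
# so the camera frame period bounds the uncertainty of any
# button-to-display measurement. Purely illustrative.

def camera_resolution_ms(camera_fps):
    """Duration of one camera frame, i.e. the measurement uncertainty."""
    return 1000.0 / camera_fps

for fps in (60, 120, 240):
    print(f"{fps:3d} fps camera -> about +/- {camera_resolution_ms(fps):.1f} ms uncertainty")
```

At 60 fps the uncertainty is a whole 16.7 ms game frame, which is why a 120+ fps recording would be more conclusive.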

Hard GPU Sync frames only goes up to 3.

I tested all values from 0 to 3 and I’m still getting a single frame of lag in Sonic using GenesisPlusGX.

Whatever these GPU sync frames do, the effect must be very subtle.

That is to be expected. That setting would never add internal game input lag. To get the actual latency and find out whether “Hard GPU Sync Frames” adds frames of lag, you need to do a button-to-display measurement using a high-speed camera, like the ones modern phones have.

3 frames of lag might be too small to be noticed by the naked eye.

If possible, you should set your video driver to Vulkan. It simplifies things, and the default settings should be as good as OpenGL with Hard GPU Sync while being lighter on your system.

I second this. Less hassle.

No need for cameras to get a clear indicator of the presentation lag. Simply show the OS mouse cursor with mouse ungrab, enable the lightgun pointer in any core that has one, and keep moving the cursor at different speeds.

Just like here: https://www.vsynctester.com/

Frame stepping has nothing to do with this lag; it only shows the core’s internal input-reading delay.

Nope. The Saturn Out Run menu clearly reacts on the same second frame as Sonic does on the Mega Drive. Except with PicoDrive, which has one extra frame compared to GenPlusGX.

And the default Vulkan swapchain setting is 3, which is not as responsive as 2 (the equivalent of the other drivers at their respective minimum option).

I’m not sure I follow what you mean. How could the mouse cursor or lightgun pointer give us the full input lag measurement including the graphical stack?

The https://www.vsynctester.com/ website only measures the difference between the hardware cursor and a JavaScript software cursor inside the browser. The hardware cursor itself already has lag, and I don’t understand how this is related to measuring RetroArch’s input lag.

And how can you measure the exact time a button was pressed without a camera recording your finger movement?

You’re right! I forgot to mention this. For minimal lag you should set the swapchain value to 2; as long as your PC can keep up with it, that’s the way to get the lowest graphical-stack lag.
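If I’m remembering the key correctly, this corresponds to the `video_max_swapchain_images` setting in retroarch.cfg (also exposed under the video settings in the menu); treat the exact name and menu path as assumptions to verify on your build:

```
# retroarch.cfg: fewer swapchain images = less buffering = lower latency,
# at the cost of less headroom when a frame comes in late
video_max_swapchain_images = "2"
```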

The difference between the “hardware” cursor and the “software” cursor is the lag. You can clearly feel it without trying to measure it.

I agree that there is lag between the hardware and software cursors, but that is completely unrelated to RetroArch input lag measurement.

No, it is totally related.

Maybe not, though, if you have an Nvidia card, based on pneumatic’s testing numbers. It seems you could be adding almost another frame of lag (or more) by switching to Vulkan on Nvidia for some reason. On AMD it seems great, though.

Has anyone run their own tests to compare with pneumatic’s? I have always stuck with gl and Hard GPU Sync (I have a 2070 Super), but I would like to switch to Vulkan if I can get the same or better input lag.

Sorry, I can’t actually comment on that. I only had an Nvidia card about a decade ago, and barely played RetroArch on that machine (not that I’d remember much anyway), so I’m completely clueless about that matter. However, regarding Intel/AMD, I can say that Vulkan is great with RetroArch; I’ve done many personal tests with the duo. Nonetheless, I’d like to add that a few cores might have trouble with Vulkan, like Flycast (texture issues) or PPSSPP (crashes).

The RetroArch program (or any other software) has no way of measuring latency outside of its execution scope.

For example, latency induced by:

  • controller/mouse/keyboard micro-controller processing
  • USB polling rate
  • operating system input stack and driver processing
  • (RetroArch is sandwiched here)
  • graphical stack and driver processing
  • monitor/TV micro-controller processing

The total hardware latency is the sum of all of those, so I completely disagree that it’s possible to measure hardware input lag using software alone.
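To make the “sum of stages” point concrete, here is a back-of-the-envelope sketch. Every number below is a made-up placeholder, not a measurement, and only the core slice is the part software like RetroArch can observe:

```python
# Total button-to-photon latency is the SUM of every stage in the chain.
# All values are invented placeholders for illustration only.

stages_ms = {
    "controller micro-controller": 2.0,                    # assumption
    "USB polling (125 Hz worst case)": 8.0,                # 1/125 s
    "OS input stack + driver": 1.0,                        # assumption
    "core / internal game lag (software-visible)": 16.7,   # 1 frame @ 60 Hz
    "graphics stack + driver": 16.7,                       # assumption: 1 buffered frame
    "monitor/TV processing": 10.0,                         # assumption
}

total = sum(stages_ms.values())
visible = stages_ms["core / internal game lag (software-visible)"]
print(f"total: {total:.1f} ms; software can only account for {visible:.1f} ms of it")
```

Frame stepping, run-ahead calibration, and the like can only ever report that one software-visible slice.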

Back in April I recorded 60 FPS video of MiSTer FPGA’s Saturn core (off a CRT screen, not a capture card) and compared it to the number of frames I have to skip on Beetle Saturn. The results generally lined up, even taking into consideration that camera-based input lag measurements can be off by 1 frame.

  • Croc: 7-11 frames (FPGA, camera) vs 7-10 (Beetle Saturn)
  • Sonic 1 (Sonic Jam): 3-4 frames (FPGA) vs 4 (Beetle Saturn)

I’ve never written games for these consoles but it makes sense to me that the 3D consoles have more input lag than their predecessors since they have to fill up a framebuffer before any image can be displayed. The Saturn in particular has to synchronize work across 2 CPUs and 2 video chips, so I’m not surprised it seems to need at least 3 frames total to render anything.

Any games that run at less than 60 FPS (i.e. most of the 3D ones) are also going to take an input lag hit, but that’s true across all consoles. Croc seems to update its physics at 15 FPS on Saturn which is why its lag numbers are particularly bad, but you can find similarly bad games on other consoles. Twisted Metal for PSX seems to run at 20 FPS and has 6-8 frames of input lag as a result.
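The arithmetic behind those numbers, as I understand it (illustrative sketch; display assumed to be 60 Hz):

```python
# Convert lag counted in display frames to milliseconds, and show why a
# low internal update rate inflates lag: a 15 Hz engine can leave a
# button press waiting up to a full update period before it is even
# sampled. Display rate assumed to be 60 Hz.

def lag_ms(display_frames, display_hz=60):
    """Lag measured in display frames, expressed in milliseconds."""
    return display_frames * 1000.0 / display_hz

def worst_case_sampling_wait_ms(update_hz):
    """Longest a press can wait before a low-rate engine reads it."""
    return 1000.0 / update_hz

print(f"Croc, 7-11 display frames: {lag_ms(7):.0f}-{lag_ms(11):.0f} ms")
print(f"15 Hz physics alone adds up to {worst_case_sampling_wait_ms(15):.0f} ms of waiting")
```

The same conversion applies to the Twisted Metal example: at 20 FPS, each engine update spans three display frames, so its 6-8 frames of lag follow naturally.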

Loading up Quake II (PS1) and overclocking the CPU to between 200% and 300%, or even 400%, is an easy way to see this particular game drastically improve its internal input lag. I find it interesting how the game and its engine affect playability; it’s probably a case-by-case thing, but interesting nonetheless.
