An input lag investigation

That’s great, thanks.

@Brunnis I noticed something odd in the px68k core when using the single-step frame advance method. If you open the px68k emulator menu by pressing “F12” (it opens the internal emulator menu where you can set FDD1 and FDD2 etc., see image below), pause (“p”), and then hold down or up and single-frame advance (“k”), the cursor in the menu moves within the same frame as the frame advance.

I have not encountered this behaviour with any other core (as far as I know, the quickest response elsewhere comes on the second frame advance, i.e. pressing “k” twice). **

Is this normal/expected behaviour, or is there something odd going on? Does it possibly have any implication for how we’ve been counting the delay for NES and SNES?

[screenshot: px68k internal “F12” menu]

** Note that once a game is running, the px68k core shows “normal” latencies of two or three presses of “k” before movement is shown. So it may be something odd with how the frame stepping interacts with the internal menu?
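
For anyone reproducing this, the single-step method only relies on two RetroArch hotkeys, which can be checked or rebound in retroarch.cfg or in the hotkey binds menu. As a reminder, the usual defaults (verify against your own config) are:

```
# Default RetroArch hotkeys used for the frame-advance test
input_pause_toggle = "p"     # pause/unpause the running core
input_frame_advance = "k"    # step exactly one frame while paused
```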

That’s completely normal. It simply means the menu has next frame response. The same can be seen on NES/SNES emulators if you run content that responds on the next frame. For example, try the menu in Mega Man 2. While this game takes two frames to respond during gameplay, it has single frame response in the menu.

Did you try it?

I’m not getting the same results.

px68k in “F12” menu shown in my previous post:

[down+k] = cursor moves

Mega Man 2 menu on NES (Mesen core):

[down+k] + [k] = cursor moves

One frame difference between the two. It would be great if you could take a closer look / try my example.

I haven’t had the time to test yet, but last time I tested Mega Man 2 in the menu it definitely only needed a single frame advance to show the result. Maybe the response depends on which menu view you’re in. I tested on the view with the blue background where you choose between going to the stage select screen or entering a password. Try pausing there, then pressing/holding the Start button and then ’k’. It should respond immediately.

I just did a quick test: Mega Man 2 takes two frames to respond on the stage select screen, BUT only one frame on the Start/Password select screen. Also, single frame response on the very first screen with text when you start up the game.

Thanks, I could replicate it now!

Hi, I’m a new user here. As far as I know, RetroArch and Canoe themselves should both produce a response on the third frame after receiving input, since that’s how Super Mario World behaves even on a real console. We know for a fact that snes9x2010 doesn’t add any extra lag on top of this. I think it’s pretty safe to assume that Canoe and RetroArch perform the same here.

A new feature of RetroArch 1.7.1 is adjustable audio resampler quality. Previously this was fixed at Normal on PC; now you can set it lower than that, to Lower or Lowest.

You guys should check if it helps with further reducing latency. Also note that this audio resampler quality level only applies to the sinc resampler; it has no effect on the nearest or CC resamplers.
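
If you want to flip this from the config file rather than the menu, something like the snippet below should do it. The exact numeric mapping of the quality values is an assumption on my part, so double-check it against the Settings > Audio menu in 1.7.1:

```
audio_resampler = "sinc"         # the quality level only affects the sinc resampler
audio_resampler_quality = "1"    # assumed mapping: 1 = Lowest, 2 = Lower, 3 = Normal
```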

1 Like

With the official release of 1.7.1 and reports of D3D11 offering significant performance improvements, I wonder if anyone would be willing to test what kind of lag we are looking at, considering this would reduce the amount of compute needed for a simple emulation box built with latency in mind.

I loaded it up and I don’t see any settings for a Hard GPU sync. Does this make a noticeable difference on the D3D11 driver or are we getting it for “free” like when running KMS/DRM via Linux? I don’t have a 240fps camera/phone available but to me, D3D11 with its lack of Hard GPU sync has a feeling of more lag, even if it lets me set the frame delay higher.

I gotta say, even at half a frame worse than Brunnis (I can only manage a frame delay of about 8), having access to my PC with its modern CPU, a low-latency monitor and the raphnet v2 adapter really is impressive. It makes it hard to go back to my nice big TV.
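
For reference, these are the knobs being compared here, as they appear in retroarch.cfg (an excerpt from a stock config; note that the hard sync options are only honored by video drivers that actually implement them, which D3D11 doesn’t yet):

```
video_driver = "d3d11"         # new D3D11 driver in 1.7.1
video_hard_sync = "true"       # Hard GPU Sync; currently has no effect on d3d11/d3d12
video_hard_sync_frames = "0"   # 0 = strictest CPU/GPU synchronization
video_frame_delay = "8"        # milliseconds to wait each frame before running the core
```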

1 Like

Of course zero added lag from an emulator would be ideal, but under 100 ms is totally adaptable, and hence playable, thanks to our easily fooled brains. :slightly_smiling_face:

If you want to test how much lag you can adapt to, you can try using a shader that only displays the output of a previous frame (e.g. PREV6) to add extra frames of delay to the video output.
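
If anyone wants to try that, here’s a minimal sketch of such a delay shader in the slang format, where the PREV-style history textures are exposed as OriginalHistory# (just an illustration I typed up, not a tested preset):

```
#version 450

layout(std140, set = 0, binding = 0) uniform UBO
{
    mat4 MVP;
} global;

#pragma stage vertex
layout(location = 0) in vec4 Position;
layout(location = 1) in vec2 TexCoord;
layout(location = 0) out vec2 vTexCoord;

void main()
{
    gl_Position = global.MVP * Position;
    vTexCoord   = TexCoord;
}

#pragma stage fragment
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D OriginalHistory6;

void main()
{
    /* Show the frame from six frames ago to add artificial display delay. */
    FragColor = texture(OriginalHistory6, vTexCoord);
}
```

Save it as e.g. delay6.slang and load it through a one-line preset (shaders = "1", shader0 = "delay6.slang").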

1 Like

Neither swapchain control nor Hard GPU Sync has been implemented yet in the D3D11/D3D12 drivers. So take that into consideration.

We’d like to support it though.

1 Like

I have been doing some experimentation today with RetroArch running on the Raspberry Pi VC4 open source driver (with GL over KMS/DRM on the RetroArch side), and I found that, after precise monitor refresh rate measurement, I can leave VSYNC OFF and still get no tearing at all! Is that expected somehow? Have you guys seen it with other KMS/DRM implementations?

Also, I take it hard_gpu_sync makes no sense on KMS/DRM, right?

@Brunnis @Twinaphex

On Blurbusters there’s a pretty interesting article on how to improve input lag in emulators and match the latency of the original device: “Eliminate Input Lag on PC-Based Emulators: Matching the Latency of the Original Device”.

And there’s a more in-depth discussion in their forum: “Emulator Developers: Lagless VSYNC ON Algorithm”.

From what I read, it seems a bit focused on using Windows features/APIs, so I’m not sure how well suited this would be to RetroArch. Interesting concept nonetheless.

1 Like

The biggest issue with that algorithm for RA/libretro, I think, is that it requires running emulators in fractions of a frame, just a few scanlines at a time. Libretro is typically set up for a single frame’s worth of emulation, so I’m not sure how we would be able to break that up.
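
To make that concrete: the libretro API’s unit of work is one whole frame. The frontend calls retro_run(), and the core hands back a completed frame through the video refresh callback; there is no per-scanline entry point to slice that up. A simplified sketch of the frontend side (illustrative only, not actual RetroArch code):

```c
#include "libretro.h"

/* Called by the core once per retro_run() with a fully rendered frame.
   The API has no finer-grained (per-scanline) equivalent. */
static void video_refresh(const void *data, unsigned width,
                          unsigned height, size_t pitch)
{
   /* upload and present the completed frame */
}

static void frontend_init(void)
{
   retro_set_video_refresh(video_refresh);
}

static void frontend_iterate(void)
{
   retro_run(); /* emulate exactly one frame, then return */
}
```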

I hear that the APIs that get the status of the raster (scanline position) do not always work, and fail on some devices.

It’s also mutually exclusive with using run-ahead to compensate for a game’s internal lag, and eliminating that internal lag is much more powerful.
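
For completeness, run-ahead is just a couple of settings in retroarch.cfg (key names taken from a recent build; double-check them if they’ve changed):

```
run_ahead_enabled = "true"
run_ahead_frames = "1"                 # should match the game's internal lag frames
run_ahead_secondary_instance = "true"  # run a second core instance to avoid save-state glitches
```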

That looks gorgeous!

Great to see frame time and deviation exposed so conveniently. I have a feeling I’ll be spending a lot more time playing with these stats than playing games :slight_smile:

Ah, possibly I misunderstood?

Is Frame Time supposed to display the time in ms that RetroArch spent to create the frame, or the time in ms that the frame spends on the screen?

The latest nightly on macOS seems to be showing time-on-screen: https://imgur.com/a/ngdYr

Just check the source code, I guess, to be absolutely sure. These values were previously reported at RetroArch exit if you invoked it from the command line; all I did was hook them up so they can be seen in-game.

I can clearly see that it’s not showing the time it took to create the frame.

The relevant times would be (see the rough timing sketch after this list):

  • Time spent running the emulator core, excluding time spent in video callback
  • Time spent in video callback uploading the texture
  • Time running the shaders
  • Time spent waiting for Present/SwapBuffers to finish
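
Measuring those separately is mostly a matter of timestamping around each stage with a monotonic clock. A rough sketch (the helper functions below are hypothetical stand-ins, only showing where the timestamps would go):

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-ins for the real code paths. */
static void core_run(void)           { /* retro_run(): emulation + video callback/upload */ }
static void render_and_present(void) { /* shader passes + Present/SwapBuffers wait */ }

static int64_t now_us(void)
{
   struct timespec ts;
   clock_gettime(CLOCK_MONOTONIC, &ts);
   return (int64_t)ts.tv_sec * 1000000 + ts.tv_nsec / 1000;
}

void run_one_frame(void)
{
   int64_t t0 = now_us();
   core_run();
   int64_t t1 = now_us();
   render_and_present();
   int64_t t2 = now_us();

   printf("core+upload: %lld us, shaders+swap: %lld us\n",
          (long long)(t1 - t0), (long long)(t2 - t1));
}
```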
1 Like

In case you’d like to improve its behavior through a pull request, let me give you some pointers -

gfx/video_driver.c (line 2399):

Here is where frame_time gets set.

Then, later on, we set video_info.frame_time here -

This is the value that gets used in the statistics.