An input lag investigation

Have you tried a few other emulators? I tested the 240p test suite on Genesis GX and several SNES cores, and Genesis GX seems more responsive than the SNES cores when running windowed with vsync in Windows 10. Hard Sync only makes a difference for me when I use exclusive fullscreen with vsync. I ran the Genesis GX lag test on my laptop plugged into an HDTV with 2-3 frames of delay, and with vsync off I mostly get excellent reflex scores of below 0.5 frames. Turning vsync on adds latency on both my laptop (with its own display) and my desktop (PC monitor); Hard Sync helps somewhat, but it’s not as fast as running without vsync. Have you tried testing without vsync? It helps a lot for me, despite the tearing.

Thanks for sharing your results.

I still wonder about the Hard Sync on Linux case… could it work on your main i7 PC with Linux installed as a second boot option?

Thank you for doing this. I would also like to see whether the results vary much, if at all, with different controllers and whether vsync on/off has any effect. On a purely subjective note, I have been spending a good deal of time lately playing SNES Super Mario and the MegaMan games and they really don’t “feel” like they are lagging at all. When I die it always feels like me just sucking and not the controls or the emulation input lag. I can make precision jumps and I can miss them just like I did when I was a kid. I also do not have any fancy hardware, just an average run-of-the-mill monitor and a Samsung TV.

So input lag is reduced on a Windows 10 machine by 1 frame when using the Hard GPU Synchronization?

If I want to enable it, is there another setting that I should adjust as well? Or just turn it on and I’m good to go?

[QUOTE=Spooniest;36562]So input lag is reduced on a Windows 10 machine by 1 frame when using the Hard GPU Synchronization? If I want to enable it, is there another setting that I should adjust as well? Or just turn it on and I’m good to go?[/QUOTE] The way I understand it, that’s all you need to do. Vsync should be off unless you get really bad screen tearing; that’s the tradeoff. One other option you can try playing with is Frame Delay: adjust it upwards until you hear sound crackling, then dial it back until the crackling goes away. I’m not sure how much of an effect, if any, that has on input lag. This is purely how I understand things from reading around here; I’m sure someone else would have more and better info than I do.
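For anyone wanting to try this, these are the relevant retroarch.cfg keys (just a sketch of the kind of low-latency setup described above, with example values rather than anyone’s actual settings):

```
video_vsync = "false"          # vsync off, accepting tearing, as discussed above
video_hard_sync = "true"       # Hard GPU Sync
video_hard_sync_frames = "0"   # how many frames the CPU may run ahead of the GPU
video_frame_delay = "0"        # raise gradually until audio crackles, then back it off
```

All of these can also be changed from the Settings > Video menu instead of editing the file by hand.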

This monitor supposedly has almost no input lag (~1 ms)

The current best monitors are around 9 ms in the input lag department, so I don’t think that one will be anywhere near 1 ms.

This has already been discussed so often, but also been ignored just as often. Maybe it is because there are so many computer setups out there. Maybe it is because people who use emulators do not have the original hardware to test on a CRT.

You could also test with the 240p test suite’s manual lag test. It’s a slightly simpler setup than yours.

After a few tests in the past I get around 2.0 frames / 34 ms (with and without CRT shaders; it makes no difference for me). My monitor causes 12 ms of lag on its own; the remaining 22 ms is caused by the emulator/PC/GPU or by testing errors on my part. That’s still too much lag IMO, but it also depends on the game you are planning to play.
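For clarity, the arithmetic behind those figures works out roughly like this (my own back-of-the-envelope sketch, assuming a 60 Hz display):

```python
# Rough breakdown of the ~2 frame / ~34 ms result above (60 Hz display assumed)
frame_time_ms = 1000.0 / 60.0                 # ~16.7 ms per frame
total_lag_ms = 2.0 * frame_time_ms            # ~33.3 ms, reported as ~34 ms
monitor_lag_ms = 12.0                         # the monitor's own contribution
remainder_ms = total_lag_ms - monitor_lag_ms  # ~21-22 ms left for emulator/PC/GPU or test error
print(round(total_lag_ms, 1), round(remainder_ms, 1))  # 33.3 21.3
```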

Yes, this is true. I had an Asus VG248QE, which the first line of GSync monitors was based on, and I measured its latency at 15 ms using a professional calibration tool.

I guess I’ve just been fortunate enough to never really notice this stuff (not to say it’s not important or an issue for others); it’s all microseconds to me. If I was a world champion Donkey Kong player I might notice it more, I guess :stuck_out_tongue:

Without vsync there WILL be tearing, won’t there? Is there a way to achieve tear-free scrolling in LR/RA without vsync?

[QUOTE=vanfanel;36596]Without vsync there WILL be tearing, won’t there? Is there a way to achieve tear-free scrolling in LR/RA without vsync?[/QUOTE] My plan to eliminate tearing is to get a G-Sync monitor. I know MAME has triple buffering, but that reintroduces input lag; I’m not sure if RetroArch has something similar to remove the tearing. Vsync also has one other potentially bad side effect: it forces games to run at 60 Hz, and in some cases that is not the proper game speed and can throw the game off to varying degrees. For consoles I don’t believe this is too big of an issue, but for arcade games it can be much more serious, as those games ran at a much wider variety of refresh rates. Again, I am not claiming to be any sort of expert on all this; I am only going by what I understand from what I have read while trying to learn it. Maybe someone with more knowledge can correct me if anything I have said is wrong.

@lordmonkus: The games are not forced to run at 60 Hz, but at your monitor’s exact physical frequency for the video mode you’re using. That can be 59.019 Hz or 60.017804 Hz. That’s externally. Internally, the game is emulated at the system’s original speed. Audio is resampled to produce the number of samples needed to keep up with the video frequency. And this is what makes RetroArch SO incredible for me: smooth scrolling no matter what your physical refresh rate is. Standalone emulators ALL have problems with this and show video hiccups or audio gaps/crackling because they don’t do it.
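To illustrate the resampling idea, here is a minimal sketch of buffer-based dynamic rate control in the spirit of what RetroArch does (the delta parameter plays the role of something like audio_rate_control_delta; this is an illustration, not the actual RetroArch source):

```python
def adjusted_resampling_ratio(base_ratio: float, buffer_fill: float,
                              delta: float = 0.005) -> float:
    """Skew the audio resampling ratio so the output buffer drifts back toward half full.

    base_ratio  -- output rate / input rate (e.g. 48000 / 32040 for an SNES core)
    buffer_fill -- current occupancy of the audio output buffer, 0.0 (empty) to 1.0 (full)
    delta       -- maximum allowed deviation from the base ratio (here 0.5 %)
    """
    # Buffer fuller than half: produce slightly fewer output samples so it drains.
    # Buffer emptier than half: produce slightly more so it fills back up.
    # The skew is far too small to hear as a pitch change, but it keeps audio
    # production locked to whatever the physical refresh rate happens to be.
    return base_ratio * (1.0 + delta * (1.0 - 2.0 * buffer_fill))

# Example: SNES core outputting ~32 kHz audio, host running at 48 kHz, buffer 60 % full.
print(adjusted_resampling_ratio(48000 / 32040, 0.60))
```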

Did a quick test yesterday using the keyboard instead of the controller and got the same input lag figures.

[QUOTE=Spooniest;36562]So input lag is reduced on a Windows 10 machine by 1 frame when using the Hard GPU Synchronization?

If I want to enable it, is there another setting that I should adjust as well? Or just turn it on and I’m good to go?[/QUOTE] Input lag compared to Windows 10 without video_hard_sync is actually reduced by 3 frames. You just turn it on and you’re good to go.

Not yet. It’s set to its default value (“Late”).

I thought so as well, until I found Prad.de’s testing. They seem to have gone to pretty extreme lengths to investigate and measure input lag on monitors. Below is a link to a very comprehensive write-up. Long story short, the common ways of measuring input lag introduce an uncertainty of several milliseconds, up to a full frame time, of error. For example, this is a big issue with the CRT-side-by-side-with-LCD method, where the analog output from a consumer graphics card generally does not seem to be fully synced to the digital output. They end up using a photodiode in conjunction with a high-speed oscilloscope patched into the actual TMDS display signal, which is probably as accurate as it can get without analyzing actual transistor voltage levels. Here’s a paragraph from Prad’s conclusion:

“Nonetheless, the question as to what exactly should be measured is a difficult one: the time until any change on the monitor is visible or only the time the electronics require until the respective pixel is controlled? As a first approximation, this difference was determined for the Samsung 2494HM and is reflected in this special case in a complete denomination. Whilst a total lag of five to six milliseconds was established via photo receiver, including the response time of the panel, it takes just 0.6 milliseconds until a pixel changes its condition in a measurable manner. The measurement could also be refined further by tapping the voltage applied to the respective transistors on an open monitor, which completely excludes the sluggishness of the liquid crystals.”

Page 9 of the HP Z24i review, where a signal delay of 0.8 ms is reported (it’s in German, under the heading “Latenzzeit”): http://www.prad.de/new/monitore/test/2013/test-hp-z24i-teil9.html

Obviously, the photodiode is much more sensitive than the naked eye, so actual pixel response time will determine how quickly I can see the change in my video clips. It shouldn’t take more than a few milliseconds before the change starts being detectable, though. So, all in all, it should have a rather negligible impact on this particular analysis.

Regarding the 240p test suite: I didn’t know about this previously. Oops… However, I tested it pretty extensively yesterday and the results don’t really make any sense. Here’s what I arrived at:

Win 10 + video_hard_sync off: 4 frames
Win 10 + video_hard_sync on: 1 frame
RetroPie: 3 frames

No difference could be detected between bsnes-mercury-balanced and snes9x-next. The differences between the different platforms/test cases are exactly the same as the ones I measured with the camera method, except for the fact that all three are exactly 5 frames quicker in the absolute sense. For example, I measured 6 frames for Win 10 with video_hard_sync enabled with the camera method, but got 6-5=1 frame with the manual lag test. For RetroPie, I measured 8 frames with the camera method, but got 8-5=3 frames with the manual lag test.

I did a quick sanity check by testing Super Mario World (the first one) and Mega Man X with the camera method. Both reacted in the same 5-6 frame window as I saw when testing Yoshi’s Island.

I will do the manual lag test while filming the screen and controller, to try to determine if I’m systematically pressing the button prematurely or if the lag test simply reacts quicker for some reason. Does anyone have any other theories?
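For anyone who wants to repeat the camera method, converting camera frames into emulated 60 Hz frames is just a division. A hypothetical helper (the 120 fps camera rate is an assumption for illustration, not necessarily what was used in these tests):

```python
def camera_to_game_frames(camera_frames: int, camera_fps: float = 120.0,
                          game_fps: float = 60.0) -> float:
    """Convert the number of camera frames between the button press and the
    first visible change on screen into 60 Hz game frames."""
    return camera_frames / (camera_fps / game_fps)

# Example: 12 frames of a 120 fps recording correspond to 6 game frames (~100 ms).
print(camera_to_game_frames(12))
```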

Do you have real hardware and a CRT for testing? Maybe the games have this lag by design?

Since I do not own a digital camera, I could only provide you with some iPhone videos of a real SNES on a CRT.

Edit: And the input lag from your monitor should be around 10 ms, since the lag values usually quoted are the grey-to-grey ones, IMO:

[QUOTE=xadox;36617]Do you have real hardware and a CRT for testing? Maybe the games have this lag by design?

Since I do not own a digital camera, I could only provide you with some iPhone videos of a real SNES on a CRT.[/QUOTE] Unfortunately, I have neither an SNES nor a CRT at the moment, but I might be able to get access to both at a friend’s. I’ll see what I can do about that.

Actually, 10 ms is for a full transition from one color to a second one and back to the first (they specifically separate the rise and fall times). According to PRAD, it takes 5 ms for each transition. So approximately 5 ms will be needed for fully changing from one color to another. However, while doing frame-by-frame analysis of a video recording, a change is visible far sooner than that. For the sake of this analysis, it effectively eliminates a big part of the response time. A 0.8 ms signal propagation time plus 2 ms for being able to detect a faint change is probably a reasonable assumption. Again, pretty insignificant when discussing delays of around 100 ms.

Of course you are right. But it is important to know what kind of values users are throwing around when they talk about lag. As I said before, most emulator users are just glad to be able to play the game at all, since they have no alternative.

For around 10 years I also just played them on an emulator. Now, in direct comparison with real hardware, I was just disappointed. It just feels wrong, even if we are only talking about 50 ms.

That would be great.

@vanfanel Thanks for clearing that up. I was fairly certain that vsync did not mess with the timing of games in RetroArch at all, but I wasn’t positive about it. Do you know whether standalone MAME is affected by vsync in this way or not? I seem to recall seeing videos and discussions about how certain arcade games, because their oddball native refresh rates are so far below 60 Hz, run too fast with vsync on.

Sorry for the double post, but for some reason I cannot make a post without everything running into a single paragraph and becoming a wall of text.

@Brunnis If you could get your hands on real hardware and games and do a comparison, that would be awesome. I would love to know if some of these games just have bad, laggy controls programmed into them. I remember as a kid when you got a badly controlling game it just felt awful. Back then we didn’t call it input lag, we just called it sluggish controls, and games would get bad reviews because of it.

Out of all this, I personally believe that in the end, as long as we can eliminate the egregious, unnecessary input lag and get it sufficiently fast, it is just as good as real hardware on a CRT. I know some people out there say beating Tyson in Punch Out or doing the tunnel level in Battletoads is near impossible in emulation but that they can do it on real hardware, and that these are the best tests for input lag. I was never good enough to do either of those back in the day anyway, so I am not a good judge of that. Just because something is measurable with extremely sensitive equipment, does it really make any real difference at the end of the day? I am not saying that input lag isn’t a potential problem, and I would love to eliminate as much of it as possible, but at what point does it become a complete waste of time, energy and money trying to get rid of that last 0.1 ms?

I don’t know how current standalone MAME works, but in the past it used to have those horrible hiccups all emulators had, due to refresh rate differences between the games and the physical screen. Some MAME hacks allowed running games at the physical screen’s refresh rate, but without a resampler that caused problems like wrong sound pitch, etc. I was tired of that crap, so I was only using FPGAs, but then I discovered Libretro/RetroArch and I found a new “home” :smiley:

Also, I have cleared the ship part in Battletoads, the one with the incoming walls, in the Gambatte core, several times. Always on an X-less Raspbian with no unnecessary services running, on the Pi/Pi2 (GLES on a Dispmanx rendering context, udev, ALSA). I haven’t noticed any more input lag than I would have noticed on a real Game Boy back in the day, and that part is playable for me.

[QUOTE=vanfanel;36635]I don’t know how current standalone MAME works, but in the past it used to have those horrible hiccups all emulators had, due to refresh rate differences between the games and the physical screen. Some MAME hacks allowed running games at the physical screen’s refresh rate, but without a resampler that caused problems like wrong sound pitch, etc. I was tired of that crap, so I was only using FPGAs, but then I discovered Libretro/RetroArch and I found a new “home” :smiley: Also, I have cleared the ship part in Battletoads, the one with the incoming walls, in the Gambatte core, several times. Always on an X-less Raspbian with no unnecessary services running, on the Pi/Pi2 (GLES on a Dispmanx rendering context, udev, ALSA). I haven’t noticed any more input lag than I would have noticed on a real Game Boy back in the day, and that part is playable for me.[/QUOTE] Yeah, I love RetroArch and I use it for most everything nowadays, except I still prefer standalone MAME. I think that sometimes, when people bitch about emulation, bring up input lag and say things like certain games are unplayable, they either don’t know how to set up their emulators or they just have a hatred for emulation and are unwilling to admit that it could ever possibly be just as good as original hardware.