Input lag w/Nestopia - SOLVED

Have you tried it with triple buffering and vsync turned off in your control panel and vsync turned on in RetroArch (with or without hard GPU sync, which should only remove one frame's worth of delay, i.e. less than ~16.7 ms, at best)?

I wonder why you’d even use triple buffering. RetroArch doesn’t even have frameskip. Triple buffering is only useful in performance-limited scenarios, letting games run at sub-60fps without dropping all the way to 45/30/15 fps. Honestly, you should only mess with CP settings in cases where you can’t hold a steady 60fps and the game doesn’t offer a triple buffering option of its own.
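To illustrate why plain double-buffered vsync quantizes the framerate like that, here’s a toy model (my own sketch with made-up frame times, not driver code): a buffer swap can only happen on a vblank, so a frame that takes even slightly longer than one 60 Hz refresh interval stays on screen for two of them.

```c
#include <stdio.h>

/* Toy model: with double-buffered vsync, a swap waits for the next
   vblank, so the effective framerate snaps to 60 / (whole intervals). */
int main(void) {
    const double vblank_ms = 1000.0 / 60.0;           /* ~16.67 ms */
    const double render_ms[] = { 15.0, 17.0, 34.0 };  /* hypothetical frame times */
    for (int i = 0; i < 3; i++) {
        int intervals = (int)(render_ms[i] / vblank_ms) + 1;
        printf("render %4.1f ms -> shown for %d interval(s) -> %.0f fps\n",
               render_ms[i], intervals, 60.0 / intervals);
    }
    return 0;
}
```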

Your measurements might be accurate, but I wonder: are you running in full screen? Windowed mode with Aero can cause issues with vsync.

Anyway, I tried the reflex test and was amazed at my consistency!

246 246 246 279 246 279 246 246 263 246 279 263 263 263 246 246 230 230 230

That was with RetroArch on Windows 8.1 using OpenGL, full screen, vsync on, and hard sync set to 1.

With hard sync at 0 it was even more consistent, and I got even better results:

213 197 213 213 230 246 213 197 197 197 230 279 230 213 213

Went ahead and tried with hard sync off and it worsened: 279 across the board with an oddball 312 here and there, but pretty consistent again.

There are many factors affecting this, by the way: the controller you use, the screen you use (since it’s an HTPC I guess it might be a TV; try game mode if available), the driver combination, and on top of that you’re adding your custom CP settings.

Turning vsync off in my CP overrides RA and forces it off, resulting in screen tearing that makes the game unplayable.

I have confirmed that I’m not running anything in windowed mode. I’m using an original NES controller connected via a USB adapter. I am using game mode on my TV, which is rated at 25 ms of input lag by displaylag.com. However, since I’m looking for the source of the difference in input lag between Nestopia in RA and Nestopia by itself on the same system, the lag added by the controller and the display is the same for both and can be ignored.

I did some more tests and there were some interesting findings.

Disabling vsync in CP and enabling it in RA showed no improvement and resulted in screen tearing, making it unplayable anyway.

The first interesting result is that disabling triple buffering in my CP, much to my surprise, resulted in a reduction in input lag. I thought I had the lag-reducing kind of triple buffering since I have an Nvidia card, but apparently even Nvidia makes the mistake of referring to “render ahead” as “triple buffering.” This confuses the hell out of everyone: only “page flip” triple buffering should be called triple buffering, but I guess it’s way too late for that now. (See the sketch below for the latency difference between the two.)
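Here’s a back-of-the-envelope comparison of the two techniques (my own illustration with hypothetical numbers, not anything from Nvidia’s driver):

```c
#include <stdio.h>

int main(void) {
    const double vblank_ms = 1000.0 / 60.0;  /* 60 Hz display */
    const double render_ms = 8.0;  /* hypothetical: GPU outruns display */
    const int queue_depth  = 2;    /* hypothetical render-ahead depth   */

    /* "Render ahead": finished frames wait in a FIFO flip queue, so a
       frame drawn now is scanned out queue_depth vblanks later, and the
       input baked into it is that much older when it hits the screen. */
    printf("render-ahead: input ~%.1f ms old at scanout\n",
           queue_depth * vblank_ms);

    /* "Page flip": at each vblank the newest completed frame is shown
       and stale finished frames are discarded, so input age is at most
       one render time plus the wait for the next vblank. */
    printf("page-flip:    input at most ~%.1f ms old at scanout\n",
           render_ms + vblank_ms);
    return 0;
}
```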

The second interesting result is that Nestopia in RetroArch STILL shows a significant input delay compared to standalone Nestopia. Here are the results:

Nestopia RA, same settings as first test except:
-triple buffer OFF in CP
-vsync ON in CP
-vsync OFF in RA
-GPU sync off RA

results in ms: 263 296 279 296 263 329 279 263 312 279 263 279 296 279 279 263 296 312 263 263

average: 282.6

Nestopia standalone, same settings as first test except:
-triple buffer OFF in CP
-vsync ON in CP
-vsync ON in Nestopia
-triple buffer OFF in Nestopia

results in ms: 242 275 258 275 258 234 258 275 258 275 258 258 291 275 275 275 258 275 258 242

average: 268.15

The difference in the averages is 14.45 ms, a bit under one 60 Hz frame (16.67 ms). The lowest result from Nestopia RA is 263 compared to 242 for Nestopia SA. The highest result is 329 for Nestopia RA and 291 for Nestopia SA.
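For anyone who wants to check the arithmetic, the RA average falls straight out of the samples (trivial sketch):

```c
#include <stdio.h>

/* Recomputes the Nestopia RA average from the 20 samples quoted above. */
int main(void) {
    const int ms[] = { 263, 296, 279, 296, 263, 329, 279, 263, 312, 279,
                       263, 279, 296, 279, 279, 263, 296, 312, 263, 263 };
    const int n = sizeof ms / sizeof ms[0];
    double sum = 0.0;
    for (int i = 0; i < n; i++) sum += ms[i];
    printf("average: %.2f ms\n", sum / n);  /* prints: average: 282.60 ms */
    return 0;
}
```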

We can conclude from the above that triple buffering is not the source of the input delay difference between Retroarch running Nestopia and standalone Nestopia. We can also conclude that triple buffering as implemented by my graphics card adds to the input delay.

So, the question remains open: what is causing the additional input delay in RA?

Can’t tell… I can’t reproduce the issue and I certainly don’t notice any input lag, but my numbers are markedly better than yours. And yeah, told ya triple buffering adds one frame of input lag. It’s not important in current-gen games, but it’s really noticeable in twitch-based games and platformers.

OK, so I tested Nestopia in RetroArch again, this time with the following settings, everything the same as the previous test except:
-in control panel: vsync set to “use application setting”
-in RetroArch: vsync set to ON

results in ms: 263 279 263 312 263 279 263 296 279 279 263 246 296 296 279 263 279 263 263 279

average: 275.15

These are the best results attained in RA thus far: less than half a frame of additional delay compared to the best results attained with standalone Nestopia (see the previous test). This is so close to the standalone results that I will have to do 1-2 more tests.

I also will have to play for at least an hour with these settings to confirm that no performance issues result.

Andres: Triple buffering per se does not add input lag. “Render ahead” triple buffering adds input lag; “page flip” triple buffering reduces it. The problem is that both methods are referred to simply as “triple buffering” by both graphics card manufacturers and developers. That’s my understanding based on the AnandTech article “Triple Buffering: Why We Love It.”

I think maybe the “render ahead” method might be more common, leading to the widespread perception that “triple buffering” per se adds to input lag, but that’s just a theory.

It doesn’t really reduce it either; it’s pretty much game-dependent. What it does is allow the game to run with vsync without the video pipeline stalling the process (i.e., it decouples input latency from video rendering). There’s a minimal sketch of that idea below.
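A minimal single-threaded simulation of that decoupling (hypothetical stubs, no real video backend; the game is assumed to tick at roughly twice the display’s refresh just to make the frame-dropping visible):

```c
#include <stdio.h>

static int front = 0;   /* buffer currently being scanned out */
static int ready = -1;  /* newest completed frame, if any     */

static int free_buffer(void) {
    for (int b = 0; b < 3; b++)        /* with three buffers, one is */
        if (b != front && b != ready)  /* always free for writing    */
            return b;
    return -1;                         /* unreachable */
}

static void game_tick(int tick) {
    int b = free_buffer();             /* never waits on vsync */
    printf("tick %d: poll input, emulate, render into buffer %d\n", tick, b);
    if (ready != -1)                   /* display missed the previous  */
        printf("        (buffer %d dropped, never shown)\n", ready);
    ready = b;                         /* frame: it simply gets dropped */
}

static void vblank(void) {             /* display side, fires at 60 Hz */
    if (ready != -1) { front = ready; ready = -1; }
    printf("vblank: scanning out buffer %d\n", front);
}

int main(void) {
    for (int tick = 0; tick < 6; tick++) {
        game_tick(tick);
        if (tick % 2 == 1)             /* ~2 game ticks per refresh */
            vblank();
    }
    return 0;
}
```

The point is just that game_tick() never blocks: input keeps getting sampled on its own schedule, and the display always shows the newest finished frame.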

There is no way triple buffering would lower latency compared to a game running without vsync.

Skimming this thread, I was wondering if using those settings would help any. I’ve read that in some cases using the application’s vsync method instead of the Nvidia driver’s forced vsync can be better.

You might find some insight into this issue from an old thread I made about input lag, where maister first added the hard sync option. Reading through that, I remembered that one difference between standalone emulators and RetroArch is that RA uses dynamic rate control to sync audio and video. I remember that being a problem back when I used standalone Nestopia: after a certain amount of time, the audio would lag behind the video. RA’s dynamic rate control fixes that. I don’t know enough about the feature to know whether it could add display lag on certain setups, though.
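As I understand it, the idea behind dynamic rate control is to nudge the audio resampling ratio by a fraction of a percent based on how full the audio buffer is, so audio stays locked to video without ever skipping or duplicating frames. A hedged sketch of just the ratio adjustment (not RetroArch’s actual source; “delta” stands in for what the config calls audio_rate_control_delta):

```c
#include <stdio.h>

int main(void) {
    const double delta = 0.005;  /* max pitch deviation, here 0.5% */
    /* fill = audio buffer occupancy, 0.0 (empty) .. 1.0 (full) */
    for (double fill = 0.0; fill <= 1.0; fill += 0.25) {
        /* buffer draining (fill < 0.5): produce slightly more samples;
           buffer backing up (fill > 0.5): produce slightly fewer */
        double ratio = 1.0 + delta * (1.0 - 2.0 * fill);
        printf("fill %.2f -> resample ratio %.4f\n", fill, ratio);
    }
    return 0;
}
```

A half-percent pitch change is inaudible in practice, which is why this works so much better than letting the buffer periodically underrun or overflow.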

“In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled.” http://www.anandtech.com/show/2794/2

“So there you have it. Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync” (http://www.anandtech.com/show/2794/4)

also see this: http://www.anandtech.com/show/2794/3

So, is the Anandtech article wrong? Is it written in a misleading way?

Ok, so I redid the test of Retroarch running Nestopia with the following settings:
-triple buffer off in CP
-vsync ON in CP
-GPU sync OFF in RA

test #2 results in ms: 246 296 263 312 263 246 296 279 296 312 279 279 246 279 279 263 263 263 279 312

average: 277.55

conclusion: There is no input lag difference between settings 1 and 2.
1. vsync ON in CP, vsync OFF in RA
2. vsync “use application setting” in CP, vsync ON in RA

As Awakened explained, though, there are other benefits to using RA’s vsync method, since it uses “dynamic rate control” (although, personally, I’ve never encountered the audio de-syncing problem with standalone Nestopia). Thus it is probably preferable to go with 2 instead of 1.

I retested Nestopia standalone with the following settings:
-triple buffer off CP
-vsync on CP
-vsync on Nestopia
-triple buffer off Nestopia

2nd test with these settings, results in ms: 275 275 291 242 275 308 291 308 242 291 275 258 258 291 275 308 275 275 275 258

average: 277.3

and the third test, this one with the exact same average as the first test using these settings:

275 275 258 242 291 291 275 275 258 275 242 258 275 242 258 324 258 275 258 258

average: 268.15

It’s somewhat incomplete in my opinion, and if you read the comments you’ll notice people saying the same. I use triple buffering in some games too, btw. If the hardware isn’t able to keep a steady 60fps, vsync causes major pain since it will slow down all the way to 30fps in many instances. I hear adaptive vsync is good, but it works by disabling vsync when the framerate dips below 60fps, leading to tearing.

Triple buffering gives me no tearing and a generally decent framerate in such games (I use it for AC4 Black Flag).

Nestopia Retroarch, 2nd test, following settings:
-CP vsync: use application settings
-RA vsync: ON
-CP triple buffer: off
-gpu sync: off

2nd test with these settings, results in ms: 279 279 263 312 279 263 279 279 279 246 296 279 263 263 279 279 279 263 296 296

average: 277.5

conclusion: There appears to be no difference in input lag between these settings and Nestopia standalone using the settings in the previous test (the preceding post).

Finally, just for the heck of it:

Nestopia standalone:
-CP vsync: use application setting
-CP triple buffer: off
-vsync ON in nestopia

results in ms: 275 308 291 324 308 291 275 308 308 291 258 291 275 308 308 291 258 291 275 308 291 324 308 275 291 308

average: 295.4

conclusion:
-Vsync appears to be broken in standalone Nestopia: “use application setting” in the CP will not activate vsync even if vsync is selected in Nestopia.
-Forcing vsync ON in the CP and turning it ON in Nestopia fixes this.

So I appear to have found the combination of settings that produces the least input lag possible on my system, and I’m very happy about that. I don’t have the horsepower for hard GPU sync (it causes performance issues), so these are the settings:

-CP vsync: use application setting
-RA vsync: on
-CP triple buffer: off
-gpu sync: off

OR

-CP vsync: on
-RA vsync: off
-CP triple buffer: off
-gpu sync: off

But this second combination doesn’t take advantage of the “dynamic rate control” of RA’s vsync, so for now I’ll stick with the first one unless I start to notice performance issues.
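For reference, here’s roughly what the RA side of the first combination looks like in retroarch.cfg (a sketch; the option names are from my copy of the config, so double-check yours):

```
# RA drives vsync; hard GPU sync off
video_vsync = "true"
video_hard_sync = "false"
```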

Comparing these to the fastest settings in Nestopia standalone, there’s really no detectable difference in input lag during gameplay. However, I do notice that RA is much more CPU intensive with these settings than Nestopia standalone is with its fastest settings. RA causes my CPU temp to rise and the fan has to blow much harder, while standalone Nestopia doesn’t have this effect. Is this to be expected running RA on an older system?

Ok, I just can’t seem to eliminate the A/V hiccup problem with RetroArch. It happens less than 5 minutes into gameplay and then recurs every couple of minutes after that. Sometimes it seems like my frame rate drops by 50% or more when it happens. I get the problem with either of the settings combinations I listed in the preceding post. I never have this problem with standalone Nestopia; games run flawlessly for hours on it.

I notice that my CPU has to work a lot harder when running Nestopia in RA vs. just running Nestopia by itself. My Core Temp readings are higher and my fan has to blow faster. Why is RA so much more CPU intensive?

I feel like I don’t have something configured properly in RA. Don’t some people use RA on a Raspberry Pi? I know my computer is more powerful than a Pi, so I don’t understand why I would be experiencing this problem.

I think this is what is happening in RA (see my previous post). I just don’t understand why my hardware wouldn’t be able to keep up with RA running Nestopia when I have no such problem with standalone Nestopia.

Is there a different version of RA I could be using that is less resource intensive? I’m currently running the Windows 7 64-bit version. Is there some config option I’m overlooking?

Ok, I think I might have solved the A/V hiccup problem (not input lag, but a sudden, jarring drop in fps accompanied by horrible crackling and slowed-down audio). This has been an ongoing issue for me with RA.

My first clue was learning that RetroArch’s vsync employs “dynamic rate control,” which syncs audio and video (thanks, Awakened!). Having seemingly exhausted all my other config options, I thought: what if my “video” problem is really an audio problem? If I’m having problems with my audio, and dynamic rate control syncs video to audio…

So I switched audio drivers from dsound to sdl, and I also switched from the 64-bit version of RA to the 32-bit version, and I have since not experienced the A/V hiccup. I played SMB for a solid half hour with no issue. I’ll need to play longer to know if I’ve solved the problem for sure, but so far things look good. What’s more, things seem slightly smoother, like maybe the fps is slightly more stable. Fingers crossed! If things work out, I’ll need to determine whether switching to 32-bit was what helped or whether it was indeed the audio driver.

Edit: Nope. No matter what I do, I can’t get rid of this annoying hiccup. It always eventually happens regardless of version, vsync settings, or audio driver. =( I’m going to make a new thread for this since it is unrelated to the input lag problem, which is seemingly resolved.

I should say the first time I tried RetroArch was because standalone Nestopia had horrible input lag with vsync ON (maybe they’ve fixed it now?); it was a lot better with RetroArch.

Sometimes I’m using a laptop which has those hiccups and frame drops here and there. I tried updating the drivers without any difference. Then I went into RetroArch’s audio settings and bumped the audio latency up from 64 to 96, and it’s perfectly fluid now. I haven’t noticed much more lag; this laptop has an i7 CPU which handles hard GPU sync at 1 frame fine.
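In retroarch.cfg terms, that tweak should just be the following (assuming the option is named the same in your build):

```
# audio buffer size in ms, up from the 64 ms default
audio_latency = "96"
```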

I never mess with the Nvidia global settings, on either my computer or this laptop.