Input lag w/Nestopia - SOLVED

“In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled.” http://www.anandtech.com/show/2794/2

“So there you have it. Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync” (http://www.anandtech.com/show/2794/4)

also see this: http://www.anandtech.com/show/2794/3

So, is the Anandtech article wrong? Is it written in a misleading way?

Ok, so I redid the test of Retroarch running Nestopia with the following settings:

- triple buffer off in CP
- vsync ON in CP
- GPU sync OFF in RA

test #2 results in ms: 246 296 263 312 263 246 296 279 296 312 279 279 246 279 279 263 263 263 279 312

average: 277.55
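For anyone who wants to double-check the math, each average here is just the mean of the listed samples. A quick sketch in Python (the variable names are my own, not anything from RA or Nestopia):

```python
# Per-press input lag samples from test #2, in milliseconds (copied from above).
timings = [246, 296, 263, 312, 263, 246, 296, 279, 296, 312,
           279, 279, 246, 279, 279, 263, 263, 263, 279, 312]

# The reported average is the arithmetic mean of the 20 samples.
average = sum(timings) / len(timings)
print(average)  # 277.55
```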

conclusion: There is no input lag difference between 1 and 2.

1. vsync ON in CP, OFF in RA
2. vsync “use application setting” in CP, vsync ON in RA

Although, as Awakened explained, there are other benefits to using RA’s vsync method, since it uses “dynamic rate control.” Personally, I’ve never encountered the audio de-syncing problem with Nestopia standalone, but it is probably still preferable to go with 2 instead of 1.

I retested Nestopia standalone with the following settings:

- triple buffer off in CP
- vsync on in CP
- vsync on in Nestopia
- triple buffer off in Nestopia

2nd test with these settings, results in ms: 275 275 291 242 275 308 291 308 242 291 275 258 258 291 275 308 275 275 275 258

average: 277.3

and the third test, using the same settings, this one with the exact same average as the first test:

275 275 258 242 291 291 275 275 258 275 242 258 275 242 258 324 258 275 258 258

average: 268.15

It’s somewhat incomplete in my opinion, and if you read the comments you’ll notice people saying the same. I use triple buffering in some games too, btw. If the hardware isn’t able to keep a steady 60fps, vsync causes major pain, since in many instances it will slow down all the way to 30fps. I hear adaptive vsync is good, but it works by disabling vsync when the framerate dips below 60fps, which leads to tearing.

Triple buffering gives me no tearing and a generally decent framerate in such games (I use it for AC4 Black Flag).
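The 60-to-30 fps drop comes from double-buffered vsync quantizing the frame rate to integer divisors of the refresh rate: if a frame takes even slightly longer than one refresh interval, the swap has to wait for the next vblank. A rough sketch of that arithmetic (assuming a fixed per-frame render time and a 60 Hz display, no triple buffering):

```python
import math

REFRESH_HZ = 60.0
VBLANK_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def effective_fps(render_ms):
    """With double-buffered vsync, a buffer swap can only happen on a
    vblank, so the effective rate is the refresh rate divided by the
    number of whole refresh intervals each frame occupies."""
    intervals = math.ceil(render_ms / VBLANK_INTERVAL_MS)
    return REFRESH_HZ / intervals

print(effective_fps(15.0))  # 60.0 -> the frame fits within one interval
print(effective_fps(18.0))  # 30.0 -> barely missing a vblank halves the rate
```

Triple buffering avoids that wait by letting the GPU keep rendering into a third buffer, which is why the framerate degrades gracefully instead of snapping to 30.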

Nestopia in Retroarch, 2nd test with the following settings:

- CP vsync: use application settings
- RA vsync: ON
- CP triple buffer: off
- gpu sync: off

2nd test these settings, results in ms: 279 279 263 312 279 263 279 279 279 246 296 279 263 263 279 279 279 263 296 296

average: 277.55

conclusion: There appears to be no difference in input lag between these settings and Nestopia standalone using the settings in the previous test (the preceding post).

Finally, just for the heck of it:

Nestopia standalone:

- CP vsync: use application setting
- CP triple buffer: off
- vsync ON in Nestopia

results in ms: 275 308 291 324 308 291 275 308 308 291 258 291 275 308 308 291 258 291 275 308 291 324 308 275 291 308

average: 295.4

conclusion:

- Vsync appears to be broken in Nestopia standalone. “Use application setting” in the CP will not activate vsync even if “vsync” is selected in Nestopia.
- Forcing vsync ON in the CP and turning it ON in Nestopia fixes this.

So I appear to have found the right combination of settings to result in the least input lag possible on my system and am very happy about that. I don’t have the horsepower to do hard gpu sync (it causes performance issues), so these are the settings:

- CP vsync: use application setting
- RA vsync: on
- CP triple buffer: off
- gpu sync: off

OR

- CP vsync: on
- RA vsync: off
- CP triple buffer: off
- gpu sync: off

But this doesn’t take advantage of the “dynamic rate control” of RA’s vsync, so for now I’ll stick with the first setup unless I start to notice performance issues.

Comparing these to the fastest settings in Nestopia standalone, there’s really no detectable difference in input lag during gameplay. However, I do notice that RA is much more CPU-intensive with these settings than Nestopia standalone is with its fastest settings. RA causes my CPU temp to rise and the fan has to blow much harder, while Nestopia standalone doesn’t have this effect. Is this to be expected running RA on an older system?

Ok, I just can’t seem to eliminate the A/V burp problem with Retroarch. It happens less than 5 minutes into gameplay and then every couple of minutes after that. Sometimes it seems like my frame rate drops by 50% or more when this happens. I get this problem with either of the settings I listed in the preceding post. I never have this problem with standalone Nestopia; games run flawlessly for hours on it.

I notice that my CPU has to work a lot harder when running Nestopia through RA vs. running Nestopia by itself. My Core Temp readings are higher and my fan has to blow faster. Why is RA so much more CPU-intensive?

I feel like I don’t have something configured properly in RA. Don’t some people use RA on Raspberry Pi? I know my computer is more powerful than a Pi, so I don’t understand why I would be experiencing this problem.

I think this is what is happening in RA (see my previous post). I just don’t understand why my hardware wouldn’t be able to keep up with RA running Nestopia when I have no such problem with standalone Nestopia.

Is there a different version of RA I could be using that is less resource intensive? I’m currently running the Windows 7 64 bit version. Is there some config option I’m overlooking?

Ok, I think I might have solved the a/v hiccup problem. (not input lag, but a sudden jarring drop in fps accompanied by horrible crackling and slow audio). This has been an ongoing issue for me with RA.

My first clue was learning that Retroarch’s vsync employs “dynamic rate control” which syncs audio and video (thanks Awakened!). Having seemingly exhausted all my other config options, I thought “what if my ‘video’ problem is really an audio problem? If I’m having problems with my audio, and dynamic rate control syncs video to audio…”
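To make the “syncs video to audio” idea concrete, here’s a rough sketch of how dynamic rate control works in principle (this is NOT RetroArch’s actual code, just the concept: the audio resampling ratio is nudged within a small delta based on how full the audio buffer is, so neither buffer ever starves):

```python
# Conceptual sketch of dynamic rate control, assuming the "Rate Control
# Delta" value visible in RA's audio settings bounds the adjustment.
RATE_CONTROL_DELTA = 0.010

def resample_ratio(buffer_fill):
    """buffer_fill in [0, 1]: 0 = nearly empty (about to underrun and
    crackle), 1 = nearly full. When the buffer runs low, stretch audio
    slightly (ratio > 1) to refill it; when full, shrink it slightly.
    At the 0.5 midpoint no adjustment is applied."""
    return 1.0 + RATE_CONTROL_DELTA * (1.0 - 2.0 * buffer_fill)

print(resample_ratio(0.5))   # 1.0   -> buffer healthy, no adjustment
print(resample_ratio(0.25))  # 1.005 -> buffer draining, stretch audio a bit
```

The point is that a struggling audio driver keeps pushing the buffer toward the extremes, and since video is tied to the same clock, an audio problem shows up as a video hiccup.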

So I switched audio drivers from dsound to sdl, and I also switched from the 64 bit version of RA to the 32 bit version and have since not experienced the a/v hiccup. I played SMB for a solid half hour with no issue. I’ll need to play longer to know if I’ve solved the problem for sure, but so far things look good. What’s more, things seem slightly smoother, like maybe the fps is slightly more stable. Fingers crossed! If things work out I’ll need to determine if switching to 32 bit was what helped or if it was indeed the audio driver.

Edit: Nope. No matter what I do, I can’t get rid of this annoying hiccup. It always eventually happens regardless of version, vsync settings, or audio driver. =( I’m going to make a new thread for this since it is unrelated to the input lag problem, which is seemingly resolved.

I should say that the first time I tried Retroarch, it was because Nestopia standalone had horrible input lag with vsync ON (maybe they’ve fixed it by now?); it was a lot better with Retroarch.

Sometimes I’m using a laptop which has those hiccups and frame drops here and there. I tried updating drivers without any difference. Then I went into Retroarch’s sound settings and bumped the audio latency up from 64 to 96. It’s perfectly fluid now. I haven’t noticed much more lag: this laptop has an i7 CPU which handles hard GPU sync at 1 frame fine.
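For reference, that latency setting roughly determines the size of the audio buffer, which is why a higher value gives the driver more headroom before an underrun. A back-of-the-envelope calculation (assuming a typical 48 kHz output rate; the exact mapping inside RA may differ):

```python
SAMPLE_RATE = 48000  # Hz, an assumed typical output rate

def buffer_frames(latency_ms):
    """Approximate audio buffer size (in sample frames) implied by a
    latency setting in milliseconds."""
    return latency_ms * SAMPLE_RATE // 1000

print(buffer_frames(64))  # 3072 frames
print(buffer_frames(96))  # 4608 frames -> ~50% more headroom before crackling
```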

I never mess with the Nvidia general settings, on either my computer or this laptop.

With nestopia standalone you need to force vsync on through the control panel and then set vsync on in nestopia. If the control panel setting is set to “use application setting,” you’ll get more input lag when you turn vsync on in nestopia.

I can’t find audio latency, where is that located?

In Retroarch: Settings -> Audio Options -> Audio Latency

[quote=“Tatsuya79”]

In Retroarch: Settings -> Audio Options -> Audio Latency[/quote]

I don’t see that option. Under Audio options, all I have is:

Mute audio (off)
Rate Control Delta (0.010)
Volume Level (0.0 dB)

I’m using 1.0.0.2 Windows 32 Bit.

That’s quite ancient by now. The 1.0.0.3 beta might roll out at the end of this week.