Major performance hit with VSync enabled


#1

Hello,

I’ve been using RetroArch successfully for several years now and in general everything has been fantastic. Recently, however, I started getting a lot of stutter/slowdown when using the Beetle PSX HW core with hardware rendering.

I spent several days tinkering with it and tried the suggestions here and here, both of which describe problems pretty similar to mine. Unfortunately, the previous suggestions don’t seem to fix the framerate issue. Turning off all shaders helps slightly, but I still get stuttering audio and bad performance. The only “fix” I’ve found is that disabling vsync resolves the problem, but that’s not something I want to keep in place long term. The other Beetle PSX core runs just fine if I switch over to it. It’s a super frustrating problem, since I ran the PSX HW core without this issue for over a year.

Some stats: Windows 10, i7-6700K processor, 16GB RAM, Nvidia GTX 970 (latest drivers from 12/12, upgraded from 388.XX as part of troubleshooting), latest RA and updated cores

Performance issues happen regardless of whether I’m running on my PC monitor or on my TV (1080p) via Steam Link

Any other tips or suggestions would be greatly appreciated.

Thanks!


#2

You can try invoking V-Sync through the Nvidia control panel. I had to do that with Mupen64Plus on one of my computers because of the exact same issue. I couldn’t play N64 games with V-Sync on because it was a laggy mess. I turned V-Sync off in RetroArch and invoked it through the driver, and it was smooth as silk. I don’t know why it worked, but it’s something you can try.


#3

Thanks, yes, I’ve tried that and it works. However, one of the comments in the other threads I linked mentioned that the preferred way to handle vsync is through RA itself because its implementation is generally better. Not sure if that’s accurate or not.


#4

I have not seen any evidence for that statement. I run my computer upstairs exclusively with V-Sync turned on in the control panel and have never had issues with extra input lag. There’s generally no reason to run triple buffering unless you can’t fix your V-Sync issue any other way, or you’re playing a Direct3D game below the display’s refresh rate that doesn’t implement triple buffering itself; in that case the frame rate gets cut in half until the GPU catches up, unless you use adaptive V-Sync, which turns V-Sync off whenever you drop below the display’s refresh rate.

I have another computer downstairs that I play Descent Rebirth on, and I have changed GPUs 3 times. All 3 times I had to tweak adjustments in the driver’s control panel to stop the ridiculous frame rate slowdowns caused by some weird timing issue, even though my Core i5 2320 and GTX 750 are well up to the task of running such a simple-to-run game. With the GTX 750 I have to have both V-Sync and triple buffering turned on in the driver because the game is unplayable with the game’s own V-Sync setting. My GTX 650 could use the game’s V-Sync but needed triple buffering turned on in the control panel. My oldest GPU, a Radeon HD 4350, needed no adjustments at all. That left me scratching my head for sure. Why? I really have no idea, other than it’s just different hardware, and not all hardware combinations play nice with the same settings. So I wouldn’t sweat it if there might be any input lag.


#5

RA’s implementation of vsync is definitely superior to the driver’s, not to mention Windows’. It should be smoother and way more responsive. With a computer as strong as yours, I doubt your issues have anything to do with the core’s performance or RA’s vsync, and I don’t think you should enable triple buffering like I suggested to that other guy. The reason I use it myself is that I run ReShade on top of RA, and the reason I told him to try it is that he was being double-buffer locked (double-buffered vsync can halve the framerate when it can’t reach its target).
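To see why double-buffered vsync halves the framerate rather than degrading it gradually, here is a minimal sketch. It assumes a 60 Hz display and the usual double-buffer rule that a finished frame can only be flipped on a vsync boundary; the function name and numbers are illustrative, not from any actual implementation.

```python
import math

REFRESH_MS = 1000 / 60  # one vsync interval on a 60 Hz display (~16.67 ms)

def effective_fps_double_buffer(render_ms):
    """Effective framerate under double-buffered vsync.

    With only two buffers, the GPU must hold the finished frame until
    the next vsync boundary, so every frame occupies a whole number of
    refresh intervals. Missing the interval by even 1 ms costs a full
    extra interval.
    """
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 60 / intervals

print(effective_fps_double_buffer(10))  # frame fits in one interval -> 60.0
print(effective_fps_double_buffer(17))  # just misses the interval -> 30.0
```

This is the “double-buffer lock” described above: a game that renders in 17 ms instead of 16 ms drops straight from 60 to 30 fps. A third buffer (triple buffering) lets the GPU keep rendering instead of waiting, which is why it avoids the halving at the cost of some latency.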

I read something about newer Nvidia drivers and/or a Windows 10 update from October breaking vsync; I would start there!