Retroarch slowing down every few seconds after GPU upgrade

I just upgraded my GPU from an HD 5870 1GB to a GTX 1060 3GB and my RAM from 6GB to 8GB of DDR3.

I just launched RetroArch after this and I’m having a strange issue. It happens in some games but not in others. It’s happening in MAME, FBA, and Mednafen PSX, as far as I can tell.

The games launch fine, but after a few seconds of gameplay the framerate drops from 60 to 30 FPS and gameplay becomes sssslllloooowww. This instantly goes away if I go to the XMB and then back into the game, but after a few seconds the slowdown kicks in again.

What could be causing this? It all ran just fine before with the old card and the same settings.

It seems to be related to V-Sync.

With V-Sync off, the problem is gone, but I get lots of tearing.

With V-Sync off in RetroArch and on in the Nvidia Control Panel, the problem reappears.

With V-Sync off in RetroArch and set to adaptive in the Nvidia Control Panel, the problem is gone and there’s some tearing (still better than no V-Sync at all, though).

Try enabling V-Sync through the Nvidia Control Panel instead of through RetroArch. This was mandatory for me with N64 emulation because I was getting major frame drops unless I disabled V-Sync in RetroArch. I then turned V-Sync on in the control panel and everything has been fine since. I have a GTX 950.


Try making a per-application profile for RetroArch in the Nvidia Control Panel with Power Management set to “Prefer maximum performance”.


Hey, thanks! The power management setting did the trick; problem solved. Even RetroArch’s own V-Sync is working again, with no stuttering.

Now I’m thinking… wouldn’t I benefit from setting “Prefer maximum performance” for every single game I have? I’m betting I’d see fewer framerate drops overall, wouldn’t I?

That’s possible; it will eat a bit more power, though.
In RetroArch that’s roughly 40-50W of extra consumption on a GTX 770 with the SNES emulators I tested.
It also allows more aggressive timings (Hard Sync at 0 and higher Frame Delay values) to reduce input lag, and lets heavy shaders run fine.
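For reference, the timing options mentioned above can be set in retroarch.cfg as well as through the menu. A minimal sketch of the relevant entries is below; the specific values are illustrative, not recommendations, and Hard Sync applies to the OpenGL driver:

```ini
# retroarch.cfg — timing-related options discussed in this thread (illustrative values)
video_vsync = "true"            # RetroArch's own V-Sync
video_hard_sync = "true"        # GPU hard sync (OpenGL); trades performance headroom for lower input lag
video_hard_sync_frames = "0"    # "Hard Sync 0": CPU waits for the GPU every frame
video_frame_delay = "4"         # milliseconds to wait before running the core; higher = less lag, less headroom
```

With the GPU stuck in a low power state, the aggressive values above are exactly the ones that miss frame deadlines, which is why “Prefer maximum performance” helps here.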

I had the same problem. Here is my take on the power management settings.

  • Optimal Power - absolute trash; maybe OK on a laptop where you really want to extend battery life. This should NOT be the default in Nvidia’s drivers; no idea what they were thinking.
  • Adaptive - a good default choice; works well with most games.
  • Max Perf - some games/emulators don’t run well with Adaptive, so I manually toggle Max Perf for those.

An extra 40-50W is considerable, but the 1060 has a TDP of only 120W, and my consumption in RetroArch with max performance stays pinned at about 25% of TDP. That doesn’t look too bad. It could definitely be an issue with more demanding games, though.

Thanks for the heads-up. I changed my global setting to Adaptive just in case.