Recently noticed frame pacing issues when using VRR in RetroArch

Setup: i5-13600K, RTX 4070 (driver 546.33), 32 GB DDR5-6400, Windows 11 22H2

RetroArch 1.16.0 (Git 191ca8d)

Alienware 2521HF 240 Hz FreeSync monitor

I’ve recently noticed that using FreeSync instead of traditional vsync causes some additional frame time hiccups in RetroArch on my setup. I use the Genesis Plus GX core with the 240p test suite drop shadow test to verify this, since I’ve never had issues with that core. With my display locked to 60 Hz and regular vsync enabled, the shadow pulses at an even cadence with no interruptions. When I switch to VRR at the native 240 Hz refresh, the drop shadow pulses become more uneven and sporadic.

The problem doesn’t seem to be isolated to one core. For example, in Mega Man X4 (Beetle PSX core), the scrolling text when selecting either Mega Man or Zero stutters quite a bit with VRR but is mostly smooth with 60 Hz traditional vsync.

I’ve tried switching the video driver from Vulkan to GL, but it didn’t help. I’ve also tried an RTSS FPS limit, which didn’t help either.

Has it always been like this or am I just noticing it now? Has anyone else noticed this?

Are you disabling vsync when you set ‘sync to exact content framerate’? Vsync needs to be enabled and the refresh rate set to your monitor’s native/max refresh rate, in this case 240 Hz.
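For reference, here’s roughly what that looks like in retroarch.cfg; the key names come from a stock config, and the refresh value is illustrative for a 240 Hz panel:

```
# Illustrative retroarch.cfg values for a 240 Hz VRR display.
# "Sync to Exact Content Framerate" in the menu:
vrr_runloop_enable = "true"
# Vsync stays enabled; VRR handles the pacing:
video_vsync = "true"
# Monitor's native/max refresh rate:
video_refresh_rate = "240.000000"
```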

No, vsync is still enabled and the monitor is at its native 240 Hz refresh rate when using VRR. I should add that FreeSync seems to be working normally in regular games.

You mean video_refresh_rate in retroarch.cfg? That value needs to be “240.0”.

Right now it’s technically at 239.757004, which is what I assume RetroArch detected on its own.

Edit: letting RetroArch sample it for a while changes it to 239.578339.
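If you want to rule the sampled value out, you can quit RetroArch and pin the nominal rate by hand; a sketch of the relevant retroarch.cfg line (the polled numbers above are what sampling wrote there):

```
# Was "239.757004" / "239.578339" after detection/sampling;
# force the nominal display rate instead:
video_refresh_rate = "240.000000"
```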

And it’s still stuttery? If so, that’s all I’ve got, but maybe someone more knowledgeable about this can help out, such as @Tatsuya79.

Yes, it’s still not as smooth as 60 Hz regular vsync. It’s possible it was always like this and I never noticed, but when doing a drop shadow test I can tell the difference.

It should definitely be smooth with ‘sync to exact content framerate’ and a FreeSync-compatible monitor with those settings. Something must be wonky, but I’m afraid I don’t know what it could be. I guess make sure your GPU control panel is set to “let the application decide” or whatever for all the sync stuff. I know you can use scanline sync stutter-free as well (or instead, I guess), but I don’t know any details on that.

Yeah, I’ve used RetroArch for a long time now with VRR displays and it seemed as smooth as regular vsync, so I don’t know what changed. My control panel is set up correctly, and VRR still works fine in regular games.

I’ve noticed small judders in the 240p test suite video scrolling test in various cores ever since I built my new PC and got an LG C2 to replace my IPS monitor. If I use swap interval 2 at 120 Hz with sync to exact off, those smaller, more regular judders go away, but I get a longer, more noticeable stutter every once in a while.

I never noticed the VRR judders on the IPS monitor. I thought maybe the G-Sync module in that one was doing something special that the HDMI Forum VRR on the C2 can’t do. Or maybe OLED’s fast response shows the judders more clearly than IPS. At this point I just try not to look for them, and they don’t bother me much anymore.

I have an LG screen with the G-Sync module; it’s smooth in the 240p suite with Genesis Plus GX.
That’s my only setup for that, so I can’t really test much more.

Are those small judders when using VRR? Do they go away if you set the display to 60 Hz with regular vsync?

I had an Alienware 2521H 360 Hz native G-Sync display up until a month ago. In RetroArch I never noticed a difference, but it was smoother in games using frame generation, like Cyberpunk 2077. Unfortunately I had to get rid of it due to eye strain, so I can’t test it now.

Yes to both. Though running at 120 Hz with swap interval 2 is better, since each 60 fps frame is held for two refreshes: you get the same frame pacing as 60 Hz with less display lag.
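For anyone wanting to try that setup, a sketch of the equivalent retroarch.cfg entries (key names from a stock config; values illustrative):

```
# Fixed 120 Hz with regular vsync, VRR run loop off:
vrr_runloop_enable = "false"
video_vsync = "true"
video_refresh_rate = "120.000000"
# Hold each 60 fps content frame for two 120 Hz refreshes:
video_swap_interval = "2"
```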

I would like to resurrect this thread as I believe this issue is still relevant. If people with VRR displays can share their results with the 240p drop shadow test so we can have more data points, that would be beneficial.

A CRT shader that implements interlacing (like my Scanline Classic shader) will look terrible with VRR enabled, as it relies on consistent frame timing to give a strobing effect when an interlaced signal is detected.

If it’s true that native G-Sync displays and VRR / G-Sync Compatible displays give different results with the same settings, that should be noted.

On D3D12:

I can get smooth strobing if I create a custom resolution for the exact framerate through the Nvidia Control Panel, enable vsync in RA, and set the exact framerate in RA.
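As a concrete sketch of that combination for an NTSC Genesis core (the rate below is illustrative; use whatever your core actually reports, and create the matching custom mode in Nvidia Control Panel):

```
# Fixed custom mode matching the content rate, no VRR run loop:
video_vsync = "true"
vrr_runloop_enable = "false"
# Example NTSC Genesis rate (illustrative); match your custom mode:
video_refresh_rate = "59.922751"
```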

With V-Sync on, RA set to 120 Hz without ‘Sync to Exact Content Framerate’, and Swap Interval set to Auto, I get decently smooth strobing with a consistent judder. VRR being enabled on the TV doesn’t matter for this.

With the same settings but ‘Sync to Exact Content Framerate’ enabled, I get more judder, with some frames being held so long that the drop shadow clearly appears as a solid object.

I haven’t tried V-Sync off yet, but I was under the impression that turning it off was required for VRR. There may be some misconceptions about how these settings should be configured.

If I have time, I will also try the Vulkan driver. If anyone wants me to take video, I can do that as well.

Using other techniques like shader subframes is not a viable solution, as the performance requirements seem high. At least, I can’t get audio to sync smoothly when that many video frames are being generated, and I have a very fast system relative to what most people use for RA. It seems completely unnecessary for what I’m trying to accomplish anyway.
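For completeness, the subframe count I was testing is driven by a single setting in builds that have the feature; I believe the key is video_shader_subframes, but treat the name as an assumption on my part:

```
# Shader subframes: render extra shader passes between content frames.
# 60 fps content on a 240 Hz display -> 4 subframes per frame.
# Key name assumed from recent builds:
video_shader_subframes = "4"
```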