This is entirely off-topic, but can you expand a bit on this latency difference that you have observed with Dark Souls 2? I’m playing that right now and I’m using Fullscreen + V-Sync OFF + Scanline Sync with RTSS to minimize latency. It’s working nicely so far, but now I’m wondering if there’s an even better way to shave off some extra frames.
Speaking of your issues with the Vulkan driver in RA: ever since the bug with the RTSS layers got resolved in the latest beta versions of the app (see here for further reference: https://github.com/libretro/RetroArch/issues/9668), I have not noticed any oddities with the frame pacing. That being said, I'm not using a VRR screen and I keep V-Sync active with RA, so it could be that there is still some inherent issue with Vulkan in variable refresh rate situations.
What OS/APIs are you using?
I found that the smallest input lag is possible on GNU/Linux in KMS/DRM mode with the Vulkan video driver, max_swapchain=1, no vsync, and ALSA audio with a 32 ms or smaller buffer.
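For reference, the relevant retroarch.cfg entries look roughly like this (a sketch from memory; the key names are standard retroarch.cfg options, but whether a swapchain of 1 is actually honored depends on the driver and the KMS setup):

    # Low-latency setup described above (KMS/DRM session, no X/Wayland):
    video_driver = "vulkan"
    video_vsync = "false"
    # Fewer swapchain images = less buffering; "1" may not be accepted everywhere.
    video_max_swapchain_images = "1"
    audio_driver = "alsa"
    # Audio buffer in ms; smaller is snappier but risks crackling.
    audio_latency = "32"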
However, RetroArch would need to autodetect whether the panel you are using is G-Sync/FreeSync or a regular 60 Hz one.
I have two screens, and every time I switch I also have to turn the feature on or off manually. Both screens will stutter if you don't have the correct setting, which is ON for the 240 Hz G-Sync monitor and OFF for the 60 Hz TV.
I’ve actually done quite a lot of testing with Frame Delay and it always seemed to produce pretty much exactly the expected result (i.e. input lag reduced by the corresponding number of ms). I did all my testing with the RetroArch gl video driver and vsync on. No idea what’s going on in your tests. 12 ms should be easy to detect with a 240 FPS recording. How many recorded samples do you use for producing a result? I usually use at least 20 samples to get a sufficiently stable result.
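If it helps, this is just the arithmetic I use to reduce the camera footage, sketched in Python (the sample values below are illustrative, not a real measurement run):

    import statistics

    CAMERA_FPS = 240
    MS_PER_FRAME = 1000 / CAMERA_FPS  # ~4.167 ms of real time per recorded frame

    # Each sample = camera frames counted between button press and screen reaction.
    samples = [6, 5, 7, 4, 6, 5, 5, 8, 6, 5]

    mean_ms = statistics.mean(samples) * MS_PER_FRAME
    sem_ms = statistics.stdev(samples) / len(samples) ** 0.5 * MS_PER_FRAME

    print(f"mean lag: {mean_ms:.1f} ms, standard error: +/- {sem_ms:.1f} ms (n={len(samples)})")
    # With 20+ samples the standard error shrinks well below 12 ms, so a real
    # frame delay effect of that size should be clearly visible in the average.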
Maybe you have a frame limiter somewhere (RTSS or nvidia’s in-driver limiter) that messes with frame delay.
Also, I believe frame delay is a method to reduce vsync lag. What it does is run the emulator's single-frame tick closer to the vertical sync time ("vblank"), so input gets polled later and reaches the screen sooner after it was sampled. That's all it does. If there is no vsync, then I don't think it makes sense to use it.
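To illustrate the mechanics, here's a toy model (my own sketch, not RetroArch's actual loop; the 2 ms emulation cost is an assumption):

    FRAME_TIME_MS = 1000 / 60  # ~16.7 ms between vblanks at 60 Hz

    def input_to_scanout_lag(frame_delay_ms, emulation_cost_ms=2.0):
        # The loop sleeps for frame_delay_ms after vblank, *then* polls input
        # and runs the core, so input is sampled frame_delay_ms later...
        poll_time = frame_delay_ms
        # ...but with vsync the finished frame still waits for the next vblank.
        if frame_delay_ms + emulation_cost_ms > FRAME_TIME_MS:
            raise ValueError("missed the vblank - frame delay set too high")
        return FRAME_TIME_MS - poll_time

    for delay in (0, 6, 12):
        print(f"frame_delay={delay:2d} ms -> input poll to scanout: "
              f"{input_to_scanout_lag(delay):.1f} ms")
    # 0 ms -> 16.7 ms, 6 ms -> 10.7 ms, 12 ms -> 4.7 ms: the saving matches
    # the setting 1:1, which is consistent with the measurements I described.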
I think this might be wrong. Frame delay could also use the requested emulator core FPS as the target to sync against, not just the vertical sync. Did anyone else measure lower latency with frame delay and vsync OFF?
I just tested with vsync OFF, and even though Sync to Exact Content Framerate works, the result is jittery.
So I would say it doesn’t work. Yes, the FPS indicator shows “60.1” instead of “60” for SNES games. But without enabling vsync in RetroArch, animation is not smooth. It appears to me RA uses vsync for correct frame pacing. So even though the framerate is correct (60.1 instead of 60), frame pacing is not.
I use a native g-sync display on Linux with an nvidia GPU. Frametime graph is flat and looks identical to vsync ON. But animation is jittery.
It’s easy to spot during the room transitions in Super Metroid with the Snes9x core, for example. The fast scroll animation when entering a door is 100% smooth with vsync on. With vsync off, I can clearly see jitter. It’s harder to spot with slower scrolling.
It’s vsync OFF. And yes, it’s 60FPS. I can easily tell when something is 30FPS. Every frame is there.
And no, Vulkan for me (even on Windows where I dual boot to play modern games) is excellent. I’m on a 980 Ti with a native g-sync monitor. Never had any issues with Vulkan.
I suspect this is an nvidia vs amd thing. All the advice on blurbusters about g-sync is spot on for me. Vsync does exactly what the articles claim it does. The takeaway here is that those articles are called "G-SYNC 101", not "FreeSync 101".
Not sure if trolling. The data and tests provided in those articles are extensive: every combination of vsync on/off with various frame caps was tested, thousands of individual tests were performed, and it took months to finish them. There are 15 of those articles, if you haven't noticed.
I did a quick test again as I’m on win10 now, new Ryzen PC.
240 FPS recording (1 camera frame = 4.167 ms), measured at the bottom of the monitor (LG 32GK850G).

MAME (1-frame reaction, UniBIOS boot settings):
Vulkan, G-Sync:
5 5 6 5 6 7 5 4 7 8
average 5.8 frames ≈ 24 ms

FCEUMM (1-frame reaction, runahead 1 in SMB):
G-Sync, Vulkan:
7 7 5 4 5 8 5 5 7 4
average 5.7 frames ≈ 23.75 ms

G-Sync, Vulkan, no shader:
6 3 8 5 6 5 7 4 5 6
average 5.5 frames ≈ 23 ms
So MAME is good since the fix Energy-t did.
I'm a bit faster than in my older tests on Win7; same range, but it seems I get more lucky low values.
My new computer is acting the same on that gsync monitor + nvidia GPU, good to know.
I'm on Windows 10 with a GTX 960 and I get the same behaviour: with Vulkan, if I disable v-sync the framerate drops to 30 (even though it shows 60).
Also, I do not have a VRR monitor and I'm not sure what the actual pacing is, but: if I use Vulkan + vsync + Sync to Exact Content Framerate, RA shows the correct FPS and I get no tearing; if I do the same with d3d11 or glcore, it does tear…
In the video, the animation during the room transition looks perfect. However, it doesn't look like that in real time. There is jitter in the animation that is not visible at all in the recording.
But this video proves that I use Vulkan on Linux with vsync OFF (in the RA menu you can see the FPS going above the refresh rate, which is 144 Hz in my case) and there is no 30 FPS cap or anything.
What does VRR have to do with frame pacing? VRR does not fix frame pacing issues; it has nothing at all to do with frame pacing. VRR will just update the display (by ending the vblank) when the game presents a new frame. If a game has frame pacing issues, VRR is not going to help you.
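To make that concrete, here's a toy sketch of what I mean (my own illustration, with made-up present timestamps):

    # Made-up present timestamps for a game with one pacing hiccup (ms):
    present_times_ms = [0.0, 16.7, 40.0, 50.1, 66.8]

    def vrr_display_times(presents):
        # VRR: the panel ends its vblank and refreshes as soon as a frame
        # arrives (within its refresh range), so display time ~= present time.
        return presents

    def fixed_60hz_display_times(presents):
        # Fixed 60 Hz + vsync: each frame waits for the next 16.7 ms boundary.
        step = 1000 / 60
        return [((t // step) + 1) * step for t in presents]

    print("VRR:     ", [round(t, 1) for t in vrr_display_times(present_times_ms)])
    print("fixed 60:", [round(t, 1) for t in fixed_60hz_display_times(present_times_ms)])
    # The 40.0 ms frame still shows up ~23 ms after the previous one on VRR:
    # no tearing and no refresh quantization, but the hiccup itself remains.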
And in some rare cases, gsync ON with vsync OFF gives me temporary tearing near the bottom of the screen. This is with a 120FPS cap on 144Hz. One game that does this for me is Prey 2016. Another one is the inventory and map screens in Witcher 3. When closing the inventory or map screen, during the transition from the menu to the main game, there is a split second where tearing is visible after the game engine stalls the framerate for about half a second. When LFC tries to recover from this 0FPS situation, you can see a few frames that have tearing near the bottom of the screen.
So please come down from that high horse of yours and give some consideration to other people’s input and experience.
Not on Linux. If I force vsync off, FPS goes above refresh rate, as shown in the video.
I don’t know how to demonstrate this. I recorded a video, and the jitter (not stutter, just jitter, like when viewing a 24FPS movie on a 60Hz display) is not visible in the video. I don’t have a 240FPS phone camera. I have a Nokia 5.3, and it can only record 30FPS.
So you’ll have to take my word on this. I am not lying.
Yeah. There is no other info available, so I just referred to the documented behavior. If that has changed now, great, but it wasn't mentioned anywhere. All I can say is that I found one case where turning vsync OFF results in jitter. That's all. Take it for what it is.
Yes. And what "multiple settings"? The only setting that needs changing is vsync. I toggle it on and off.
Right. I guess all the people out there who have mentioned this occurring on their machines are imaginary:
And there are plenty of Google hits for "g-sync vsync off tearing." It's a known thing; it has been known forever. I've seen it happen in the past, and it still happens today. And I still see people to this day on the Blur Busters forum (I'm part of the admin team there and thus see a lot of posts) reporting that without vsync, gsync can still tear. And it's always the same report: "temporary tearing near the bottom of the screen."