Hello Cyber. Sorry, yes, I had missed that post; it’s strange because I’m following this thread pretty closely, so thanks for pointing to it.
The “Set Display Reported Refresh Rate” option doesn’t do much here. I have activated it countless times during these experiments “just in case”, but rationally it shouldn’t help: the physical refresh rate known to RetroArch (“video_refresh_rate”) is used to adjust the refresh rate of the games to the host refresh rate when VSYNC is ON and “Sync to Exact” is OFF.
But when “Sync to Exact” is ON, the relationship is reversed: the display is supposed to sync to the emulated system’s refresh rate.
So, in a nutshell, there is no rational reason to think that checking “Set Display Reported Refresh Rate” will fix the microstuttering we are seeing when “Sync to Exact” is ON.
The reason for that microstuttering is already known. Again: RetroArch should re-align each pageflip so that the interval since the previous framebuffer flip matches the target frame period as exactly as possible.
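To make the idea concrete, here is a minimal sketch of that re-alignment logic (not RetroArch code; the function names and the assumed NTSC period of ~59.94 Hz are mine for illustration). The key point is that the next flip target is derived from the LAST flip time plus the emulated frame period, not from “whenever the frame happens to be ready”:

```c
#include <stdint.h>

/* Assumed emulated frame period in microseconds: 1e6 / 59.94, rounded. */
#define FRAME_PERIOD_US 16683

/* Timestamp (us) at which the next pageflip should happen:
 * always the previous flip plus one exact frame period. */
static int64_t next_flip_target(int64_t last_flip_us)
{
   return last_flip_us + FRAME_PERIOD_US;
}

/* Given "now" and the last flip, how long to wait before presenting.
 * Never negative: if we are already late, flip immediately. */
static int64_t wait_before_flip(int64_t now_us, int64_t last_flip_us)
{
   int64_t target = next_flip_target(last_flip_us);
   return (target > now_us) ? (target - now_us) : 0;
}
```

With this scheme, a frame that finishes early waits out the remainder of the period instead of flipping immediately, so the flip-to-flip interval stays constant even when emulation time per frame fluctuates.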
Or, in other words:
This issue is not about frame rate, but about frame time (the time between pageflips). Frame times have to be constant, not variable.
In emulation you can hold an almost constant 59.94 FPS while the individual frame times vary constantly. They must not vary that much: keeping them steady is how this issue is avoided, and it’s also why the problem is NOT so prominent in native games.
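A tiny numeric illustration of why average FPS hides the problem (hypothetical helper functions, values in microseconds): a steady 16683 us cadence and an alternating 12000/21366 us cadence both average to exactly 59.94 FPS, but the second one has nearly 5 ms of jitter per frame, which is what you perceive as microstutter.

```c
#include <stdint.h>

/* Mean frame time of a sequence, in microseconds. */
static int64_t avg_us(const int64_t *ft, int n)
{
   int64_t sum = 0;
   for (int i = 0; i < n; i++)
      sum += ft[i];
   return sum / n;
}

/* Worst deviation of any single frame time from the mean.
 * This, not the average, is what determines perceived smoothness. */
static int64_t max_jitter_us(const int64_t *ft, int n)
{
   int64_t mean  = avg_us(ft, n);
   int64_t worst = 0;
   for (int i = 0; i < n; i++)
   {
      int64_t d = ft[i] - mean;
      if (d < 0)
         d = -d;
      if (d > worst)
         worst = d;
   }
   return worst;
}
```

Both sequences report the same “FPS”, so a frame-rate counter will look perfect while pacing is badly broken.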
The original explanation was already given by the BlurBusters forum founder some time ago. This is EXPECTED to happen.
I will watch this thread a bit longer (this issue has been brought up over and over during the last years and was forgotten again and again), and in the end I think I will have to put my money where my mouth is and try to fix it myself. Comments about unrelated options and about proprietary/closed NGreedia stuff like GSync won’t help at all, since we can NOT know how that thing works, and “Sync to Exact” is supposed to work correctly with FreeSync too.
@Tatsuya79 Your input on this would be greatly appreciated. I don’t know RetroArch’s refresh rate control internals at all (I wrote some RA graphics backends years ago, but the refresh rate is not controlled from there), so: would it be possible to do an approximate “solution” in RetroArch by “forcing” the cores to request buffer swaps on a steadier cadence (possibly “micro-pausing” the cores to deceive them), or would it require per-core intervention?