Best video synchronization settings for ProMotion displays?

I run RetroArch on a MacBook Pro with a ProMotion display and am looking for the most suitable settings for this display.

I am unsure between the following settings:

  • Black Frame Insertion
  • Sync to Exact Content Framerate (G-Sync, FreeSync)

Which of the two is better suited for the MacBook Pro display?

Does the option Settings > Video > Synchronization > Sync to Exact Content Framerate (G-Sync, FreeSync) even support ProMotion?

Which settings do you use on your ProMotion display?

The two settings do different things and serve different use cases. The first (“Black Frame Insertion”) inserts a completely black picture after each frame of a game, making everything darker. That may sound useless, but it mimics the strobing of a CRT screen, which significantly improves motion clarity. If you can counteract the darkness (e.g. by raising the display brightness), it may please you; otherwise, leave it disabled.
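If you’d rather set it in retroarch.cfg directly, BFI maps to a key like the one below (a sketch; on recent desktop builds the value is the number of black frames inserted per content frame, while older builds used a plain true/false):

```
# Black Frame Insertion: insert black frame(s) between content frames
# to mimic CRT strobing (costs brightness, improves motion clarity).
video_black_frame_insertion = "1"
```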

“Sync to Exact Content Framerate” makes variable refresh rate monitors sync to the native frequency of a given game, since retro content often runs at very specific refresh rates. In return, you get better frame pacing and synchronization without messing with the audio pitch. It’s normally a very decent option to turn on if you have the required hardware. However, I don’t know whether ProMotion screens behave exactly like G-Sync/FreeSync devices, even though they’re similar. You’d need to test things out and see for yourself whether you’re getting good pacing and latency.
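In retroarch.cfg terms, that option should correspond to something like this (key names from desktop builds; I’m assuming the macOS build exposes the same keys):

```
# Sync to Exact Content Framerate (G-Sync, FreeSync):
# let a VRR display follow the core's native refresh rate
# instead of adjusting the content to the display rate.
vrr_runloop_enable = "true"
# Keep vsync enabled; with VRR the display adapts to the content.
video_vsync = "true"
```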


I played sfiii3 with BFI and quickly got annoyed by the black screen that’s visible every now and then (increasing the display brightness fixed the darker image issue, imo).

I tried to test pacing and latency. For latency, I counted frames via the step-frame feature; for pacing, I used the deviation value of the Output statistics. Is that what @md2mcb meant by testing? Anyway, here are my results:

Baseline (Retroarch default settings)

  • Latency: 2 frames (sfiii3)
  • Output: ~119.546 Hz (8.9% dev, 2048 samples)

BFI

  • Latency: 2 frames
  • Output: ~119.489 Hz (7.8% dev, 2048 samples)

VRR + 1 Swap Interval

  • Latency: 2 frames
  • Output: ~59.595 Hz (3.3% dev, 2048 samples)

VRR + Auto Swap Interval

  • Latency: 2 frames
  • Output: ~59.595 Hz (3.1% dev, 2048 samples)

Does that mean VRR + Auto Swap Interval has the best pacing?
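For reference, the VRR + swap-interval combinations above map to retroarch.cfg keys like these (a sketch; key names assumed from recent desktop builds, where a swap interval of 0 means Auto), and the step-frame counting uses the default frame-advance hotkey:

```
# Sync to Exact Content Framerate (G-Sync, FreeSync)
vrr_runloop_enable = "true"
# Swap interval: "1" for the fixed run, "0" for Auto (recent builds)
video_swap_interval = "0"
# Default hotkeys used for the step-frame latency test
input_pause_toggle = "p"
input_frame_advance = "k"
```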

From what I understand (maybe someone like hunterk could confirm whether this is accurate), the step-frame method as you used it only measures the in-game lag inherent to the game itself, not the actual real-world lag (basically, the time it takes for a reaction to actually show up on the monitor).

In other words, the different methods you tried could very well vary in how much lag they produce and still all show up as 2 frames of latency, since you’re really only checking the game’s internal lag. edit: The only exception, I think, is if you’re using some form of run-ahead; that might actually shave off frames measured with the step-frame method.
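(For reference, run-ahead lives under keys like these in retroarch.cfg; a sketch based on desktop builds:)

```
# Run-ahead: emulate N frames ahead and roll back,
# hiding part of the game's internal input lag.
run_ahead_enabled = "true"
run_ahead_frames = "1"
# Optional: run a second core instance to avoid rollback artifacts.
run_ahead_secondary_instance = "true"
```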

To check lag accurately, you really need an external measurement: a high(-ish) speed camera plus some kind of input LED setup, so you have a precise visual cue, completely outside of the monitor, that the input was pressed at a specific time.

I’ve been wanting to run some tests myself with a 240 fps camera + LED. It’s not a super high-tech setup, but I think it should give decent results.
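For a rough sense of the resolution: at 240 fps, each captured frame spans 1000 / 240 ≈ 4.2 ms, so counting camera frames between the LED lighting up and the first reaction on screen gives latency in ~4 ms steps. That’s more than enough to tell apart, say, 2 frames from 4 frames of lag at 60 Hz (16.7 ms per game frame).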


You don’t need to test BFI; it only does what I described: it helps with motion clarity. Between the two, only “Sync to Exact Content Framerate” helps with pacing and latency. You don’t even need metrics: just enable the option, play for a few days, and check that everything is fine, without any weird bugs. Later, you can disable the option and do another round of testing to see if you can feel any difference on your setup.