Yeah, real VGA CRTs may benefit from that, indeed, but real CGA/15 kHz ones can switch to an interlaced mode by themselves, so that function is almost useless there.
On the user side, there is a new static setting, commented out, in config-user-optional-template.txt:
// #define ORIGINAL_FPS_TRUSTED
Uncommenting it will make the shader trust the fps reported by the core and take the faster path to determine the content fps.
Leaving it commented (the default) will make the shader estimate the input fps itself; a bit less precise, a bit heavier and a bit laggier, but it works with all cores, which is why it is the default.
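Just to make the two paths concrete, this is roughly what the branch amounts to shader-side (a minimal sketch, not the actual koko-aio code; original_fps and estimated_fps are placeholder names for the core-reported value and the shader's own estimate):

```glsl
// Sketch: where the content fps comes from, depending on the define.
float get_content_fps(float original_fps, float estimated_fps) {
#ifdef ORIGINAL_FPS_TRUSTED
    // Fast path: trust the fps reported by the core.
    return original_fps;
#else
    // Default path: use the shader's own estimate; a bit heavier
    // and laggier, but it works with every core.
    return estimated_fps;
#endif
}
```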
Indeed, not all cores reliably tell RetroArch the current fps; e.g. flycast has a setting to do that, disabled by default, I think because it causes stutters in the core itself:
Anyway, “ORIGINAL_FPS_TRUSTED” and the fps detection are preparatory to disabling two interlacing settings when the content falls under 35 fps:
The first one was already implemented:
The second one has just been pushed:
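In both cases the logic boils down to a simple gate on the detected fps, something like this (illustrative sketch, names are mine, not the shipped code):

```glsl
// Sketch: interlacing effects only make sense on content that
// actually runs at ~50/60 fps; below 35 fps they are skipped.
bool interlacing_allowed(float content_fps) {
    return content_fps >= 35.0;
}
```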
So, to recap: the new “Real interlace” setting will convert progressive input to interlaced, the parameter will modulate the intensity of the effect, and the effect will be skipped on low-fps content (25 to 30 fps).
I suppose it will be useful on screens with a low pixel response time, like OLEDs or real CRTs, where you’re supposed to set the effect strength to the maximum to replicate the “intended” behaviour of interlaced content.
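To give a rough idea of what the conversion amounts to (a hand-wavy sketch under my own assumptions, not the actual implementation): line/frame parity decides which field is drawn this frame, and the strength parameter attenuates the other field, up to fully blanking it like a real interlaced display would.

```glsl
// Sketch: fake interlacing on progressive content.
// screen_line is the output pixel's integer y coordinate,
// frame_count the frontend frame counter, strength the user parameter
// (1.0 = fully blank the off-field lines, 0.0 = effect disabled).
vec3 real_interlace(vec3 color, float screen_line, float frame_count,
                    float strength, float content_fps) {
    // Skip the effect on low-fps content (see the gate above).
    if (content_fps < 35.0)
        return color;
    float field     = mod(frame_count, 2.0);   // which field this frame shows
    float line_par  = mod(screen_line, 2.0);   // which field this line belongs to
    float off_field = abs(field - line_par);   // 1.0 on the non-drawn field
    return color * (1.0 - strength * off_field);
}
```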