Time for another small update! I’ve just tested the impact on input lag from:
a) Shaders
b) “raw” input driver
I used the same test procedure as always (see original post in this thread), using a Core i7-6700K @ 4.4 GHz and a GTX 1080. RetroArch 1.6.0 was used and testing was performed using Windows 10 and OpenGL. 25 samples were taken for each test case.
Shaders
[Input lag below reported as number of frames at 60 FPS]
No shaders: 5.21 avg / 4.25 min / 6.00 max
crt-royale-kurozumi (Cg): 5.13 avg / 4.25 min / 6.00 max
crt-geom (Cg): 5.22 avg / 4.00 min / 6.25 max
crt-geom (GLSL): 5.08 avg / 4.00 min / 6.00 max
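For reference, the figures above can be converted to milliseconds by multiplying by the frame time at 60 FPS (~16.67 ms). For the no-shader case, for example:

5.21 frames × 16.67 ms ≈ 86.8 ms (average)
4.25 frames × 16.67 ms ≈ 70.8 ms (minimum)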
There was no measurable difference in input lag between running without a shader and running with shaders. The average, minimum and maximum input lag were the same within measuring tolerances. This means you can use shaders without worrying about introducing extra input lag.
For another data point, I also tested the crt-aperture GLSL shader on my Pentium J4205 system running Ubuntu 16.10 in DRM/KMS mode, using the built-in Intel graphics. I measured input lag with my usual test routine and, just as in my tests of the other shaders on the GTX 1080 in Windows, input lag remained unchanged after activating the shader.
One thing to remember, though, is that running the shader passes takes additional time. In other words, the time required to generate each frame increases. If you’re using the Frame Delay setting to reduce input lag, you will likely have to decrease the value in order for your computer/device to still finish rendering each frame on time. With my i7-6700K @ 4.4 GHz and GTX 1080, I had to turn frame delay down from 12 to 8 when using the crt-royale-kurozumi shader, thereby increasing input lag by 4 ms.
So, while shaders themselves don’t add any extra input lag, the increased processing time might force you to reduce the frame delay, which has a small impact on input lag. The good news is that you’ll know exactly how much your input lag increases, since it corresponds to the number of milliseconds of frame delay you have to remove in order to maintain 60 FPS.
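If you want to experiment with this trade-off yourself, the relevant settings can be changed from the RetroArch menu or directly in retroarch.cfg. A minimal sketch, assuming the usual key names (video_frame_delay, video_shader_enable, video_shader); the preset path is just an illustrative example, adjust it to wherever your shader presets live:

# Frame delay in milliseconds (0-15); lower it if a heavy shader makes frames miss the 60 FPS deadline
video_frame_delay = "8"
# Enable shaders and point to a preset (example path, adjust to your setup)
video_shader_enable = "true"
video_shader = "shaders/crt-royale-kurozumi.cgp"

Going from 12 to 8 here is exactly the 4 ms trade-off described above.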
EDIT: There might be more to this than I initially thought. See the post below by hunterk, whose own testing clearly shows a negative impact on input lag with some shaders. I have not been able to reproduce this, despite running additional tests (this post has been updated with those additional results).
The “raw” input driver
The raw input driver was introduced in RetroArch 1.6.0 and the hope was that this driver would reduce input lag. Until today, however, no tests had been run comparing it to the default dinput driver.
Unfortunately, my tests show no difference in input lag between the raw input driver and dinput. At least, none that is measurable with this test method and equipment.
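For completeness, switching between the two drivers is just a matter of changing the input driver, either from the driver settings in the menu or in retroarch.cfg (a minimal sketch, assuming the standard driver names):

# Select the input driver; "dinput" is the Windows default, "raw" is the new driver introduced in 1.6.0
input_driver = "raw"

Based on these results, there is no measurable latency reason to prefer one over the other.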
By the way, on a completely unrelated note, why does the menu shader get deactivated whenever you load a shader preset? Seems strange that such a basic thing as using a shader disables the beautiful shader used for the menu background…