From what I understand (maybe someone like hunterk could confirm whether that's accurate or not), using the frame-step method as you did only measures the lag inherent to the game itself, not the actual real-world lag (basically, the time it takes for the reaction to actually show up on the monitor).
In other words, the different methods you tried could very well differ in how much lag they produce, and they'd all still show up as 2 frames of latency, since you're really only checking the game's internal lag. edit: The only exception to this, I think, is if you're using some form of run-ahead. That one might actually shave off frames even with the frame-step method.
To check lag accurately, you really need an external measurement: a high(-ish) speed camera plus some kind of input LED setup, so you get a precise visual cue that the input was pressed at a specific time, completely independent of the monitor.
I've been wanting to run some tests myself with a 240fps camera + LED. It's not some super high-tech setup, but at roughly 4ms per camera frame I think it should give decent results.
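For counting the frames afterwards, here's a minimal sketch of how I'd process the footage, assuming the LED and a patch of the monitor each sit in a fixed region of the video. The file name, ROI coordinates and brightness thresholds below are made up and would need tuning for an actual recording:

```python
# Rough sketch: count camera frames between the input LED lighting up and the
# on-screen reaction, in a clip filmed at 240fps. ROIs/thresholds are placeholders.
import cv2
import numpy as np

VIDEO = "lag_test.mp4"          # hypothetical 240fps clip
FPS = 240.0
LED_ROI = (50, 50, 20, 20)      # x, y, w, h of the LED in the frame (made up)
SCREEN_ROI = (400, 200, 40, 40) # patch of the monitor that changes on input (made up)
LED_THRESHOLD = 200             # mean brightness that counts as "LED on"
SCREEN_DELTA = 30               # brightness change that counts as "reaction"

def roi_mean(frame, roi):
    x, y, w, h = roi
    patch = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    return float(np.mean(patch))

cap = cv2.VideoCapture(VIDEO)
led_frame = None
baseline = None
frame_idx = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if led_frame is None and roi_mean(frame, LED_ROI) > LED_THRESHOLD:
        # First frame where the LED is lit = moment the input was pressed
        led_frame = frame_idx
        baseline = roi_mean(frame, SCREEN_ROI)
    elif led_frame is not None:
        # Watch the screen patch until it visibly changes from its state at press time
        if abs(roi_mean(frame, SCREEN_ROI) - baseline) > SCREEN_DELTA:
            lag_frames = frame_idx - led_frame
            print(f"Reaction after {lag_frames} camera frames "
                  f"(~{lag_frames / FPS * 1000:.1f} ms)")
            break
    frame_idx += 1

cap.release()
```

The measurement is quantized to one camera frame (~4.2ms at 240fps), so averaging over a bunch of presses would be the way to go.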