Well, if you read stuff about visual perception, you’ll learn that not all humans are equal: basically only a small part of humanity can really perceive 60 images per second or above. You are probably among the “have-nots”, like me.
That’s just one of those myths that gets perpetuated again and again. Pretty much everyone can see higher frame rates, at least on LCD screens, where higher Hz and FPS translate to lower motion blur. This test demonstrates it:
Going from 30 to 60, images appear at a higher resolution during motion. The higher the FPS (and Hz to match) become, the clearer the image gets, asymptotically approaching the perceived resolution of a still image.
(This doesn’t apply to CRT displays. It is indeed more difficult to tell higher refresh rates and FPS apart there. Well, except for the reduced flicker, of course.)
My point was that the closer you get to 60, the fewer people will see the difference; comparing 30 & 60 doesn’t really invalidate that. Actually, your test shows that the difference between 15 & 30 is way more obvious than the difference between 30 & 60. I would be curious to see the difference with smaller steps (let’s say between 45 & 60).
Difference between 60 and 120 is huge. Pretty much the same as the difference between 30 and 60.
However, 45 is not something you can use as a comparison, since it induces more stutter than 30. In many ways it looks worse (unless you have a 90Hz display).
In order for the frame rate to be smooth, it has to sync with the screen’s Hz. 45fps on a 60Hz display will look worse than 30fps because 30fps is basically 60/2, so the frames are evenly distributed. 45 can’t fit evenly into 60, so you get an uneven frame rate with random stutters; these look like frame time/pacing stutters. G-SYNC solves this issue, and 45fps can look smoother there.
I have a 240Hz screen, and 40fps can be distributed evenly (240/6). And I can tell the difference between 40 and 60 easily. As for 60 to 120, the difference is night and day. I can even tell the difference between 120 and 240, but only because there is less LCD ghosting/blur.
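To make the pacing point concrete, here’s a minimal sketch of how rendered frames land on display refreshes with vsync. The timings are idealized assumptions (the GPU always finishes a frame in exactly 1/fps seconds, no G-SYNC), but it shows why 30fps@60Hz and 40fps@240Hz pace evenly while 45fps@60Hz alternates between one and two refreshes per frame:

```python
# A minimal sketch with assumed, idealized timings: every frame takes exactly
# 1/fps seconds to render, vsync is on, and there is no G-SYNC/variable refresh.

def presentation_intervals(fps, hz, frames=12):
    """Return how long each frame stays on screen (ms) under vsync."""
    refresh_ms = 1000.0 / hz
    # Frame n is ready at n/fps seconds and gets shown at the next refresh,
    # i.e. at refresh slot ceil(n * hz / fps) (exact integer arithmetic).
    slots = [(n * hz + fps - 1) // fps for n in range(frames + 1)]
    return [round((b - a) * refresh_ms, 1) for a, b in zip(slots, slots[1:])]

for fps, hz in [(30, 60), (45, 60), (40, 240)]:
    print(f"{fps} fps @ {hz} Hz:", presentation_intervals(fps, hz))

# 30 fps @ 60 Hz : 33.3 ms every frame          -> even pacing (60/2)
# 45 fps @ 60 Hz : 33.3, 16.7, 16.7, 33.3, ...  -> uneven, visible judder
# 40 fps @ 240 Hz: 25.0 ms every frame          -> even pacing (240/6)
```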
I think we hijacked the topic
Maybe a moderator could split these posts in a new thread in the off topic section?
Maybe for you. While the 30/60 difference is clearly noticeable for me, I think it’s really so-so compared to the 15/30 difference, and that’s on a fast & infinite scrolling effect over 1920p, the kind of effect where it will be by far the most noticeable. Transposing this to the current case seems far-fetched; I still don’t think many people can really notice the frame rate issue mentioned here. I certainly don’t, but the fact that I’m not sensitive to frame rates isn’t anything new.
Do you have a 120Hz display? If not, you can’t really know whether you would be able to tell or not. In my case, I also thought that 60 was the perfect target and that I wouldn’t see anything getting much better with 120. But when I upgraded my monitor, the new one did support 120, and I was immediately proven completely and utterly wrong. Not only could I tell a difference, but after I got used to 120, using the older 60Hz display made me wonder how I was able to put up with it in fast-paced content (like racing games, first-person shooters, or modern side-scrollers that aren’t locked to 60).
Not so much because “I can see more frames”, but because of the higher resolution that you get during motion. It is very easy for me to tell the difference between, for example, a 480p image upscaled to 1080p with a blur filter and a native 1080p image. And I think it’s not a stretch to say that most people are also able to tell the difference. During motion, 60FPS@60Hz in a 1080p game looks like 480p. With 120FPS@120Hz it looks like 720p. The faster the motion, the worse the resolution gets.
In any event, it is very hard for me to imagine that most people are not able to differentiate between 480p and 720p, and by extension (since this is what motion resolution translates to with higher refresh rate plus higher frame rate) between 60FPS@60Hz and 120FPS@120Hz.
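For what it’s worth, here’s a rough sketch of the sample-and-hold smear behind those numbers. The scroll speed is just an assumed value for illustration, not a measurement from any particular game; the point is that the smear per frame is roughly speed/Hz, so it halves every time the refresh rate and frame rate double:

```python
# A rough sketch of sample-and-hold motion blur on an LCD. When the eye tracks
# moving content, each frame gets smeared across the distance the content
# travels during one refresh, so the smear is roughly speed / Hz (with FPS
# matching the refresh rate). The scroll speed below is an assumed value.

SCROLL_SPEED_PX_PER_S = 1920  # e.g. panning one 1080p screen width per second

for hz in (30, 60, 120, 240, 480):
    blur_px = SCROLL_SPEED_PX_PER_S / hz  # per-frame smear width in pixels
    print(f"{hz:>3} Hz with matching FPS: ~{blur_px:5.1f} px of motion blur")

# Doubling Hz and FPS halves the smear, so moving detail keeps getting sharper,
# asymptotically approaching the clarity of a still image.
```

The 480p/720p equivalence is of course only a ballpark, but the halving of the smear when going from 60 to 120 is the mechanism behind it.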