I have always heard that the human eye can only perceive around 30 frames per second. Yet in the PC gaming community, people scramble to go from 50 frames per second to 55, though that could be little more than status-seeking. When it comes to TVs, people often talk about refresh rate, and the faster the refresh rate, the better. If the human eye really can only perceive 30 frames per second, do all of the different technologies boosting a device's output actually matter? I haven't done any side-by-side comparisons on devices or anything like that; I'm just asking out of curiosity. Thanks in advance.
I just watched some Battlefield comparisons on both YouTube and Dailymail and I didn't see huge differences. I play games, and if the frame rate drops or isn't steady, it causes lag in gameplay: the game slows slightly or speeds up and down, and the visuals can get choppy. Every eye is different too; what you will or won't notice varies, since no two eyes are 100% the same. Also, people like bragging rights and keeping up with the Joneses.
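To put rough numbers on the "not steady" part: two games can both average 30fps, but one with even frame times feels smooth while one with jittery frame times stutters. A minimal Python sketch (the two traces are made-up illustrative numbers, not measurements from any game):

```python
# Two frame-time traces (in ms) with the same ~30 fps average:
# one steady, one uneven with the same mean.
steady = [33.3] * 6                             # every frame takes ~33.3 ms
uneven = [16.7, 50.0, 16.7, 50.0, 16.7, 50.0]   # same average, but jittery

for name, trace in (("steady", steady), ("uneven", uneven)):
    avg = sum(trace) / len(trace)
    worst = max(trace)
    print(f"{name}: avg {avg:.1f} ms (~{1000/avg:.0f} fps), worst frame {worst:.1f} ms")

# Both traces report ~30 fps on average, but the uneven one has 50 ms
# hitches, and those hitches are what read as stutter in-game.
```

So an fps counter alone can hide exactly the choppiness being described here.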
All I know is, every 60fps game I've ever seen looks insanely smooth compared to any 30fps game. Search YouTube for some videos, particularly of Just Cause 3: the PC version can run at 60fps and looks marvelous, and there are comparison vids with the console version, which runs at about 30fps, and you can definitely see the difference. Anything past 60fps I'd call a waste, but the 30-to-60 difference is all kinds of noticeable.
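For anyone curious why 30fps to 60fps is so obvious while something like 50fps to 55fps barely registers, the frame-time math helps. A quick Python sketch (the fps values are just example points, not thresholds from any study):

```python
# Frame time is how long each image stays on screen: 1000 ms / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 50, 55, 60, 120):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# Going 30 -> 60 fps cuts frame time from ~33.3 ms to ~16.7 ms, a 16.7 ms
# improvement per frame. Going 50 -> 55 fps only shaves off ~1.8 ms, which
# is why the first jump is obvious and the second is mostly bragging rights.
```

Same reason the returns past 60fps feel small: 60 to 120fps only saves another ~8.3 ms per frame.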