I have always heard that the human eye can only perceive around 30 frames per second. Yet in the PC gaming community, people scramble to go from 50 frames per second to 55, though that could be little more than status-seeking. When it comes to TVs, people often talk about refresh rate, and the faster the refresh rate, the better. If the human eye really can only perceive 30 frames per second, do all of the technologies boosting a device's output actually matter? I haven't done any side-by-side comparisons on devices or anything like that; I'm just asking out of curiosity. Thanks in advance.