
If there is motion blur, yes, your brain sees motion at 24 frames per second. Without motion blur, not so much. Try gaming at 24fps and then at 60fps; you will notice the difference.

I've never seen 120fps, so I can't tell you if that helps. But if you're playing back 24fps source material (movies) and 30/60fps source material (TV), then you need 120Hz refresh at a minimum, just to be able to display each frame for the correct amount of time.
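
For what it's worth, the 120Hz figure is just the least common multiple of the source frame rates. A quick sketch in Python (standard library only):

    # Lowest refresh rate that divides evenly into 24, 30, and 60 fps,
    # so every source frame can be held for a whole number of cycles.
    from math import lcm

    sources = [24, 30, 60]    # film, legacy TV, modern TV
    print(lcm(*sources))      # 120 -> 5, 4, and 2 cycles per frame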



Motion blur is an artifact caused by fast motion recorded with a slow shutter speed on camera equipment, or by a full-screen shader effect found in modern games. A monitor or TV will not be the cause of motion blur. Whilst you're correct that there is a perceivable difference between 30 and 60 Hz whilst gaming, it would be negligible here, and the "this isn't right" feeling is most likely due to latency.
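
Roughly speaking, both versions of the effect amount to averaging the scene over the shutter interval. A minimal sketch of that idea, where render_at is a hypothetical function returning a frame as a NumPy array for a given time:

    import numpy as np

    def motion_blurred_frame(render_at, t, shutter=1/48, samples=8):
        """Average `samples` renders across the shutter-open interval,
        the way a slow shutter integrates light over time."""
        offsets = np.linspace(0.0, shutter, samples)
        return np.mean([render_at(t + dt) for dt in offsets], axis=0)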


TVs can cause motion blur when they do cross-frame interpolation to "smooth" the image, which many high-refresh-rate TVs do nowadays.

(edit) averaging -> interpolation
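
The crudest form of that smoothing is just a blend of adjacent frames. A minimal sketch, assuming frames are NumPy arrays (real sets use motion-compensated interpolation; this only shows the blending step that smears fast motion):

    import numpy as np

    def interpolate(frame_a, frame_b, alpha=0.5):
        """Synthesize an in-between frame as a linear blend;
        alpha is the position between the two real frames, in [0, 1]."""
        return (1.0 - alpha) * frame_a + alpha * frame_b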


That's what I'm saying. When you watch a movie, each field (frame) is blurred. When you move your mouse, there is no motion blur on the pointer, so you need more frames per second to convince your brain that you're not looking at a very fast slideshow.



