> 15ms is about the threshold humans can detect latency
I think this is an oversimplification. We can detect latency, meaning a delay between input and screen response, at around 13ms.
But that doesn't equate to a new frame every 13ms being perfect, latency-wise. First, it ignores all the other sources of latency, such as input devices. Assume 6ms of latency on the controller; that means you have to get a new frame out within 7ms of receiving the controller input.
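The budget arithmetic above can be sketched out. This is just a toy illustration: the 13ms threshold and 6ms controller latency are the figures from this comment, and any further pipeline stages (USB polling, display scanout, etc.) would eat into the same budget.

```python
# Toy latency-budget calculation, using the numbers from the comment above.
PERCEPTION_THRESHOLD_MS = 13.0  # rough human detection threshold for input latency
CONTROLLER_LATENCY_MS = 6.0     # assumed latency of the input device

# Whatever the perception threshold is, every latency source before
# rendering shrinks the time left to produce and present a frame.
render_budget_ms = PERCEPTION_THRESHOLD_MS - CONTROLLER_LATENCY_MS
print(f"Remaining frame budget: {render_budget_ms:.1f} ms")  # 7.0 ms
```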
That aside, there's a lot more than latency involved in creating beautiful smooth images on screen. For example, humans can detect visual change unrelated to input at about 500Hz or even higher, especially in grayscale. I've heard that about 0.5ms gray-to-gray, i.e. 2000Hz, is about the limit beyond which further improvements would be meaningless.
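As a sanity check on those figures, refresh rate and frame time are just reciprocals of each other, which is where the 0.5ms/2000Hz equivalence comes from:

```python
def period_ms(rate_hz: float) -> float:
    """Frame (or gray-to-gray transition) time in milliseconds for a rate in Hz."""
    return 1000.0 / rate_hz

print(period_ms(500))   # 2.0 ms per frame at 500Hz
print(period_ms(2000))  # 0.5 ms, the gray-to-gray limit mentioned above
```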
Of course the law of diminishing returns applies here. The fastest screen I've tested is 360Hz, and I can't say that I see any benefits over a 144Hz screen, although I can see the difference in artificial benchmarks on Blur Busters.