Scientists claim you can't see the difference between 1440p and 8K at 10 feet in new study on the limits of the human eye — would still be an improvement on the previously-touted upper limit of 60 pixels per degree
Measuring displays in pixels per degree (PPD), researchers found the human eye can resolve up to 94 PPD for greyscale patterns, but as few as 53 PPD for yellow and violet.
Researchers at the University of Cambridge and Meta Reality Labs have conducted a new study into how the human eye perceives pixels on displays of different sizes and resolutions, and claim that beyond a certain pixel density, additional resolution makes no discernible difference, as reported by TechXplore. According to the calculator they developed, at a distance of 10 feet, a 50-inch screen looks almost identical at 1440p and at 8K.
The researchers highlighted that as displays grow larger and pack in ever more pixels, it's important to know what humans can actually see, so that we aren't developing technologies that are largely redundant. But where previous studies have looked at perceived retinal resolution, these researchers looked at the resolution a viewer could perceive with utmost clarity and no blur, indistinguishable from a perfect reference, and measured it separately for different colors and tones.
Using a sliding display setup that gave them continuous control over effective resolution, the researchers found that the human eye can resolve black-and-white patterns at up to 94 pixels per degree, red and green patterns at up to 89 PPD, but yellow and violet patterns at just 53 PPD.
The researchers also examined the effects of viewing a pixel directly versus at an angle, the overall size of the screen, its pixel density, the brightness (or darkness) of the room, and the distance between the viewer and the screen.
Correlating all of this data, the researchers built a calculator that takes inputs such as resolution, screen size, and viewing distance, and shows how the study's results extrapolate to that setup. Using that data, we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution. The calculator indicates that for a 50-inch 1440p display viewed at 10 feet, only 1 percent of the population would notice the difference between that image and a 'perfect' reference; at 4K, that number drops to 0 percent, and 8K would naturally be the same. According to the scientists, all three resolutions would look broadly the same at that distance.
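The article doesn't reproduce the researchers' calculator itself, but the pixels-per-degree geometry underneath it is standard trigonometry. Here's a minimal sketch in Python, assuming a flat 16:9 panel viewed head-on from its center; the real calculator also folds in factors like color and viewing conditions that this ignores:

```python
import math

def pixels_per_degree(diagonal_in: float, horizontal_px: int, vertical_px: int,
                      distance_ft: float) -> float:
    """Approximate pixels per degree at the center of a flat panel.

    diagonal_in   -- screen diagonal in inches
    horizontal_px -- horizontal resolution (e.g. 2560 for 1440p)
    vertical_px   -- vertical resolution (e.g. 1440)
    distance_ft   -- viewing distance in feet
    """
    aspect = horizontal_px / vertical_px
    # Physical screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = width_in / horizontal_px
    distance_in = distance_ft * 12
    # Angle subtended by a single pixel at the viewer's eye, in degrees.
    pixel_angle_deg = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_in)))
    return 1 / pixel_angle_deg

# The article's example: a 50-inch screen viewed from 10 feet.
print(round(pixels_per_degree(50, 2560, 1440, 10)))   # ~123 PPD at 1440p
print(round(pixels_per_degree(50, 3840, 2160, 10)))   # ~185 PPD at 4K
print(round(pixels_per_degree(50, 7680, 4320, 10)))   # ~369 PPD at 8K
```

All three figures sit above the 94 PPD the study measured for black-and-white detail, which is the geometric reason the calculator reports that essentially nobody can tell the resolutions apart at that distance.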
"If you have more pixels in your display, it's less efficient, it costs more and it requires more processing power to drive it," said co-author Professor Rafał Mantiuk, from Cambridge's Department of Computer Science and Technology.
"So we wanted to know the point at which it makes no sense to further improve the resolution of the display."
Before this study, the human eye was commonly held to top out at around 60 pixels per degree, so these results raise the bar for what the eye can perceive at a given distance and screen size. That could prompt display manufacturers to adjust their designs, and indeed, the researchers hope the results will help guide display development, as well as image, rendering, and video coding technologies in the future.
Although there's an argument to be made that 8K displays are still somewhat redundant while there's little native content for them and few GPUs powerful enough to run them, I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.
Maybe what I'm noticing is the extra detail a higher-resolution render can carry, rather than the pixels themselves? Or maybe I should just listen to the scientists, who've done the kind of testing I'd need to do to prove my point.

Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.
heffeque: I can definitely see pixels on my 1080p 55" TV at a normal viewing distance, but I'm fairly sure that when I upgrade, it won't be anything beyond 4K.
4K is already a bit overkill, so anything above 4K (for a normal-sized TV at a normal viewing distance) is going to be nonsense.
Obviously 5K-8K monitors do make sense (much shorter viewing distance).
Stomx: What a BS. At that QHD 1440 resolution, 50-inch size, and 10-foot distance, the PPD of 124 is twice the so-called 'retina display' pixel density; run a PPD calculator on the web. Of course no one will see any differences.
Zaranthos: There are a lot of tests and studies on this, and it gets very complicated and confusing. There are differences between what you see, what you perceive, and how quickly you can see movement. After wading through the complicated mess of what the human eye can and cannot see, the basic conclusion is that modern display technology is not even close to what the human eye and brain can perceive, despite the technical abilities of the eye itself being pretty limited. It will still be a long time before GPU and display technology can exceed the capabilities of the human eye and brain combination. It may still come down to sitting down, using both, and ending with "I don't know, this one just feels more real," while not being able to "see the difference."
bit_user:
The article said: "we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution."
More than 10 years ago, I already figured out that 10 feet was about the limit of how far back I could discern individual pixels on a 60" 1080p screen, with corrective lenses!
However, there's a difference between distinguishing individual pixels and declaring that 1080p is "enough", at that distance. The difference is that screens don't behave like optimal reconstruction filters, so you can still get aliasing artifacts, where higher frequency signals can manifest partly as lower frequencies which are perceivable at a distance. Therefore, I maintain there's still some benefit to using resolutions > 1080p on a 60" screen for me, at 10 feet.
[Image] An example of a poorly sampled brick pattern, showing aliasing (i.e. a Moiré pattern) when sampling below the Nyquist limit. Source: https://en.wikipedia.org/wiki/Aliasing
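To make the reconstruction-filter point concrete, here's a small Python sketch (using numpy and Pillow 9.1+, not anything from the study) that downsamples a synthetic high-frequency pattern two ways: naive pixel-skipping, which produces exactly the kind of Moiré rings shown above, and a Lanczos resize, which low-pass filters before decimating:

```python
import numpy as np
from PIL import Image

# Synthetic "zone plate": a pattern whose spatial frequency rises toward the
# edges, so frequencies above the target resolution's Nyquist limit are
# guaranteed to be present.
N = 2048
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
pattern = 0.5 + 0.5 * np.cos((x**2 + y**2) * 600)
src = Image.fromarray((pattern * 255).astype(np.uint8))

target = (256, 256)

# 1) Naive decimation: keep every 8th pixel. High frequencies fold back into
#    low ones and show up as spurious rings (Moiré).
aliased = Image.fromarray(np.asarray(src)[::8, ::8])

# 2) Proper downscale: Lanczos resampling prefilters the image before
#    decimating, so the fake rings largely disappear.
filtered = src.resize(target, resample=Image.Resampling.LANCZOS)

aliased.save("aliased.png")
filtered.save("filtered.png")
```

Comparing the two output files side by side shows why a screen that can't act as an ideal reconstruction filter still benefits from extra source resolution.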
Even so, I draw the limit at 4k. I don't foresee myself going to 8k at any distance. The only argument I could see for > 4k is in a truly wrap-around screen, like Sphere, where you need high-res everywhere, since you're not confining the field of view to normal screen proportions. In normal viewing contexts, 4k gives the content creators plenty of headroom to pre-filter high frequencies, without the image appearing overly soft.
That said, I get why content creators might want to film at 8k, because you need to start with a higher res source, prior to anti-aliasing. Furthermore, 8k provides some extra margin for cropping or zooming.
The article said: "I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable."
It's an easy experiment to try, even if you don't have access to 50" monitors of either resolution. Let's say your 1440p monitor is 27" and your 4k monitor is 32". Put the first at a distance of 5.4 feet (or 5 feet and 4.8 inches) and put the second display at 6.4 feet (or 6 feet and 4.8 inches). The conversions are trivial to compute, because apparent scale changes as a linear function of distance. Both monitors should now fill the same area of your field of view.
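That distance conversion is just linear scaling of apparent size. A throwaway helper, purely illustrative, that reproduces those numbers:

```python
def matched_distance_ft(reference_diag_in: float, reference_dist_ft: float,
                        new_diag_in: float) -> float:
    """Viewing distance at which a screen of new_diag_in inches subtends roughly
    the same angle as reference_diag_in inches viewed from reference_dist_ft feet."""
    return reference_dist_ft * new_diag_in / reference_diag_in

# Match the article's 50-inch-at-10-feet scenario with smaller monitors.
print(matched_distance_ft(50, 10, 27))  # 5.4 ft (5 ft 4.8 in) for the 27" 1440p
print(matched_distance_ft(50, 10, 32))  # 6.4 ft (6 ft 4.8 in) for the 32" 4K
```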
After adjusting both monitors to have the same approximate brightness, take an 8k image and scale it down to each monitor's native resolution, using a high-quality filter, like lanczos*. Then, make it full-screen and see if you can discern any details on the 4k monitor you can't see on the 1440p one.
* To counter the aliasing artifacts I mentioned above, I'd filter it a bit more aggressively, but this gets you in the ballpark of what each monitor can optimally display.
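For the image-prep step, a similarly illustrative sketch (Pillow again; the 8K source filename is a placeholder, and the master is assumed to be 16:9) that scales one image down to each monitor's native resolution, approximating the footnote's "slightly more aggressive" prefilter with a small Gaussian blur before the Lanczos resize:

```python
from PIL import Image, ImageFilter

# Hypothetical 8K master image; substitute your own file.
master = Image.open("test_scene_8k.png")

targets = {
    "monitor_1440p.png": (2560, 1440),
    "monitor_2160p.png": (3840, 2160),
}

for name, size in targets.items():
    # Mild Gaussian blur as a stand-in for the extra prefiltering, scaled with
    # the shrink factor, followed by a Lanczos downscale to native resolution
    # (requires Pillow >= 9.1 for Image.Resampling).
    radius = master.width / size[0] * 0.5
    prefiltered = master.filter(ImageFilter.GaussianBlur(radius=radius))
    prefiltered.resize(size, resample=Image.Resampling.LANCZOS).save(name)
```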