Scientists claim you can't see the difference between 1440p and 8K at 10 feet in new study on the limits of the human eye — would still be an improvement on the previously-touted upper limit of 60 pixels per degree

Girl undergoing an eye exam.
(Image credit: Getty Images/bluecinema)

Researchers at the University of Cambridge and Meta Reality Labs have conducted a new study into how the human eye perceives pixels on displays of different sizes and resolutions, and claim that past a certain combination of size, distance, and pixel density, there's no discernible difference (via TechXplore). According to the calculator they developed, at a distance of 10 feet, a 50-inch screen looks almost identical at 1440p and 8K resolution.
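
For a rough sanity check of that claim, here's a minimal sketch of the standard pixels-per-degree (PPD) arithmetic. This is a generic calculation, not the researchers' actual calculator, and it assumes a flat 16:9 panel viewed on-axis:

```python
import math

def pixels_per_degree(h_pixels: int, diagonal_in: float, distance_ft: float) -> float:
    """PPD at the center of a flat 16:9 screen viewed on-axis."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from the diagonal
    ppi = h_pixels / width_in                         # horizontal pixels per inch
    inches_per_degree = distance_ft * 12 * math.tan(math.radians(1))
    return ppi * inches_per_degree

# The article's example: a 50-inch screen viewed from 10 feet.
for name, px in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(px, 50, 10):.0f} PPD")
# 1440p already lands around 123 PPD, roughly double the oft-cited
# 60 PPD "retina" threshold, which is why 8K's ~369 PPD buys nothing.
```

The same back-of-the-envelope numbers come up again in the comments below.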

The researchers highlighted that as displays grow larger and pack in ever greater resolutions, it's important to know what humans can actually see, so that we aren't developing technology that is largely redundant. But where previous studies have measured perceived retinal resolution, these researchers measured the resolution at which the viewer perceives an image with utmost clarity and no blur, indistinguishable from a perfect reference, and did so separately for different colors and tones.

Although there's an argument to be made that 8K displays are still somewhat redundant when there's little native content for them and few GPUs powerful enough to run them, I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.

Maybe what I'm noticing is a perceived increase in resolution from the added detail such a rendered image might show, rather than actually seeing the pixels. Then again, maybe I should just listen to the scientists, who've done the kind of testing I'd need to do to prove my point.


Jon Martindale
Freelance Writer

Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.

  • heffeque
    I can definitely see pixels on my 1080p 55" TV at a normal viewing distance, but I'm fairly sure that when I upgrade, it won't be anything beyond 4K.

    4K is already a bit overkill, so anything above 4K (for a normal-sized TV at a normal viewing distance) is going to be nonsense.

    Obviously 5K-8K monitors do make sense (much shorter viewing distance).
    Reply
  • Stomx
    What BS. At that QHD (1440p) resolution, 50-inch size, and 10-foot distance, the PPD is 124, twice the so-called retina display pixel density; run a PPD calculator on the web. Of course no one will see any difference.
    Reply
  • Zaranthos
    There are a lot of tests and studies on this, and it gets very complicated and confusing. There are differences between what you see, what you perceive, and the speed at which you can see movement. After digging into the complicated mess of what the human eye can and cannot see, the basic conclusion is that modern display technology is not even close to what the human eye and brain can perceive, despite the technical abilities of the human eye being pretty limited. It will still be a long time before GPU and display technology can exceed the capabilities of the human eye and brain combined. It may still come down to sitting down, using both, and ending with, "I don't know, this one just feels more real," while not being able to "see the difference."
    Reply
  • bit_user
    The article said:
    we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution.
    More than 10 years ago, I already figured out that 10 feet was about the limit of how far back I could discern individual pixels on a 60" 1080p screen, with corrective lenses!

    However, there's a difference between distinguishing individual pixels and declaring that 1080p is "enough", at that distance. The difference is that screens don't behave like optimal reconstruction filters, so you can still get aliasing artifacts, where higher frequency signals can manifest partly as lower frequencies which are perceivable at a distance. Therefore, I maintain there's still some benefit to using resolutions > 1080p on a 60" screen for me, at 10 feet.
    Caption: An example of a poorly sampled brick pattern, showing aliasing (i.e., a Moiré pattern) when sampling below the Nyquist limit.

    Source: https://en.wikipedia.org/wiki/Aliasing
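
    To see this effect for yourself, here's a minimal sketch that synthesizes a high-frequency stripe pattern (standing in for the brick pattern above) and downsamples it two ways with Pillow; point sampling aliases the stripes into coarse false bands, while Lanczos resampling low-pass filters them away. The filenames are placeholders:

    ```python
    import numpy as np
    from PIL import Image

    # Fine vertical stripes with a ~6-pixel period, standing in for fine detail.
    size = 2048
    x = np.arange(size)
    row = ((np.sin(2 * np.pi * x / 6) > 0) * 255).astype(np.uint8)
    img = Image.fromarray(np.tile(row, (size, 1)))

    # An 8x downsample puts the stripe frequency well past the Nyquist limit.
    small = (size // 8, size // 8)
    img.resize(small, Image.NEAREST).save("aliased.png")   # point sampling: false bands
    img.resize(small, Image.LANCZOS).save("filtered.png")  # pre-filtered: smooth gray
    ```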
    Even so, I draw the limit at 4k. I don't foresee myself going to 8k at any distance. The only argument I could see for > 4k is in a truly wrap-around screen, like Sphere, where you need high-res everywhere, since you're not confining the field of view to normal screen proportions. In normal viewing contexts, 4k gives the content creators plenty of headroom to pre-filter high frequencies, without the image appearing overly soft.

    That said, I get why content creators might want to film at 8k, because you need to start with a higher res source, prior to anti-aliasing. Furthermore, 8k provides some extra margin for cropping or zooming.

    The article said:
    I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.
    It's an easy experiment to try, even if you don't have access to 50" monitors of either resolution. Let's say your 1440p monitor is 27" and your 4k monitor is 32". Put the first at a distance of 5.4 feet (or 5 feet and 4.8 inches) and put the second display at 6.4 feet (or 6 feet and 4.8 inches). The conversions are trivial to compute, because scale changes as a linear function of distance. Both monitors should now fill the same area of your field of view.
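
    As a minimal sketch of that conversion (the 50-inch/10-foot pairing is the article's reference point, and the function name is mine): by similar triangles, scaling size and distance by the same factor preserves the subtended angle exactly.

    ```python
    def matched_distance_ft(diagonal_in: float,
                            ref_diagonal_in: float = 50.0,
                            ref_distance_ft: float = 10.0) -> float:
        """Distance at which a screen of the given diagonal fills the same
        field of view as the reference screen at the reference distance."""
        return ref_distance_ft * diagonal_in / ref_diagonal_in

    print(matched_distance_ft(27))  # 5.4 ft for the 27" 1440p monitor
    print(matched_distance_ft(32))  # 6.4 ft for the 32" 4K monitor
    ```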

    After adjusting both monitors to approximately the same brightness, take an 8k image and scale it down to each monitor's native resolution using a high-quality filter, like Lanczos*. Then, make it full-screen and see if you can discern any details on the 4k monitor you can't see on the 1440p one.

    * To counter the aliasing artifacts I mentioned above, I'd filter it a bit more aggressively, but this gets you in the ballpark of what each monitor can optimally display.
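
    As a minimal sketch of that preparation step with Pillow (the source filename is hypothetical; any 8k image works):

    ```python
    from PIL import Image

    src = Image.open("8k_test.png")  # hypothetical 7680x4320 source image

    # Lanczos resampling low-pass filters while resizing, suppressing the
    # aliasing artifacts discussed above; downscale once per native resolution.
    src.resize((3840, 2160), Image.LANCZOS).save("for_4k_monitor.png")
    src.resize((2560, 1440), Image.LANCZOS).save("for_1440p_monitor.png")
    ```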
    Reply