Scientists claim you can't see the difference between 1440p and 8K at 10 feet in a new study on the limits of the human eye — though the measured limit still beats the previously touted ceiling of 60 pixels per degree
Measuring the pixels per degree of a display, researchers found the human eye can resolve up to 94 PPD for black-and-white patterns, but as few as 53 for yellow and violet.
Researchers at the University of Cambridge and Meta Reality Labs have conducted a new study into how the human eye perceives pixels on displays of different sizes and resolutions, and claim that past a certain combination of size, resolution, and viewing distance, there's no discernible difference (via TechXplore). According to the calculator they developed, at a distance of 10 feet, a 50-inch screen looks almost identical at 1440p and 8K resolution.
The researchers highlighted that as displays grow larger and pack in ever greater resolutions, it's important to know what humans can actually see, so that we aren't developing technologies that are largely redundant. But where previous studies looked at perceived retinal resolution, these researchers measured the resolution at which the viewer perceives an image with utmost clarity and no blur, indistinguishable from a perfect reference, and did so separately for different colors and tones.
Using a display mounted on a sliding track to vary the effective resolution continuously, the researchers found that the human eye can perceive black-and-white patterns at up to 94 pixels per degree, red and green patterns at up to 89 pixels per degree, but yellow and violet patterns at just 53 pixels per degree.
The researchers looked into the effects of looking directly at a pixel or from an angle, the overall size of the screen, its pixel density, the brightness (or darkness) of a room, and the distance between the viewer and the screen.
By correlating all of this data, the researchers developed a calculator into which you can input factors like resolution, screen size, and viewing distance to see how the study's results extrapolate. Using that data, we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution. The calculator indicates that for a 50-inch 1440p display viewed at 10 feet, only 1% of the population would notice the difference between that image and a 'perfect' image; at 4K, that number drops to 0%, and naturally 8K would be the same. According to the scientists, all three resolutions would look broadly the same at that distance.
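For reference, the geometry behind such a calculator is simple enough to sketch. The snippet below uses the standard pixels-per-degree formula for a flat 16:9 screen viewed head-on, not the researchers' published model, so treat it as a rough cross-check rather than a reproduction of their tool:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_res, distance_in, aspect=(16, 9)):
    """Approximate pixels per degree at the center of a flat screen.

    diagonal_in: screen diagonal in inches
    horizontal_res: horizontal pixel count (e.g. 2560 for 1440p)
    distance_in: viewing distance in inches
    """
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # physical screen width
    pixel_pitch = width_in / horizontal_res        # inches per pixel
    # Angle one pixel subtends at the viewing distance, in degrees
    degrees_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / degrees_per_pixel

# 50-inch screen viewed from 10 feet (120 inches)
for name, hres in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(50, hres, 120):.0f} PPD")
```

All three work out to well over 100 PPD at that distance (roughly 123, 184, and 369 respectively), comfortably beyond the ~94 PPD limit the study reports for black-and-white patterns, which is why the calculator predicts essentially nobody noticing a difference.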
"If you have more pixels in your display, it's less efficient, it costs more and it requires more processing power to drive it," said co-author Professor Rafał Mantiuk, from Cambridge's Department of Computer Science and Technology.
"So we wanted to know the point at which it makes no sense to further improve the resolution of the display."
Before this study, the human eye was thought to be capable of resolving 60 pixels per degree, so the new results raise the bar for what the human eye can perceive at a given distance and screen size. That could prompt display manufacturers to adjust their designs, and indeed, the researchers hope the results could help guide display development, as well as image, rendering, and video coding technologies in the future.
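To put that shift in concrete terms, the same viewing geometry can be turned around to ask how close you need to sit before a display stops out-resolving your eye. Again, this is a generic sketch using a hypothetical 65-inch 4K TV as the example, not the study's own tool:

```python
import math

def min_distance_for_ppd(diagonal_in, horizontal_res, target_ppd, aspect=(16, 9)):
    """Closest viewing distance (in inches) at which the display hits target_ppd.

    Sit farther back than this and extra resolution is wasted; sit closer and
    you could, in principle, still perceive more detail.
    """
    w, h = aspect
    pixel_pitch = (diagonal_in * w / math.hypot(w, h)) / horizontal_res
    # One pixel may subtend at most 1/target_ppd degrees
    max_angle = math.radians(1 / target_ppd)
    return pixel_pitch / (2 * math.tan(max_angle / 2))

for ppd in (60, 94):
    feet = min_distance_for_ppd(65, 3840, ppd) / 12
    print(f"A 65-inch 4K TV reaches {ppd} PPD at about {feet:.1f} feet")
```

Under the old 60 PPD rule of thumb, a 65-inch 4K panel already out-resolves the eye from about four feet back; the 94 PPD figure pushes that boundary out to roughly six and a half feet, widening the range of setups where resolution above 4K could theoretically matter.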
Although there's an argument to be made that 8K displays are still somewhat redundant while there's little native content for them and few GPUs powerful enough to drive them, I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.
Maybe it's a perceived resolution increase from the added detail such a rendered image might show, more than actually seeing the pixels? Also, maybe I should just listen to the scientists who've done the kind of testing I'd need to do to prove my point.

Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.
heffeque
I can definitely see pixels on my 1080p 55" TV at a normal viewing distance, but I'm fairly sure that when I upgrade, it won't be anything beyond 4K.
4K is already a bit overkill, so anything above 4K (for a normal-sized TV at a normal viewing distance) is going to be nonsense.
Obviously 5K-8K monitors do make sense (much shorter viewing distance).
Stomx
What a load of BS. At QHD (1440p) resolution, 50-inch size, and 10 feet distance, the PPD is 124, twice the so-called 'retina display' pixel density; run any PPD calculator on the web. Of course no one will see any difference.
Zaranthos
There are a lot of tests and studies on this, and it gets very complicated and confusing. There are differences between what you see, what you perceive, and the speed at which you can perceive movement. After digging into the complicated mess of what the human eye can and cannot see, the basic conclusion is that modern display technology is not even close to what the human eye and brain can perceive, despite the technical abilities of the human eye being pretty limited. It will still be a long time before GPU and display technology can exceed the capabilities of the human eye and brain combination. It may still come down to sitting down and using both and ending with "I don't know, this one just feels more real," while not being able to "see the difference."
bit_user
The article said:
"we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution."
More than 10 years ago, I already figured out that 10 feet was about the limit of how far back I could discern individual pixels on a 60" 1080p screen, with corrective lenses!
However, there's a difference between distinguishing individual pixels and declaring that 1080p is "enough", at that distance. The difference is that screens don't behave like optimal reconstruction filters, so you can still get aliasing artifacts, where higher frequency signals can manifest partly as lower frequencies which are perceivable at a distance. Therefore, I maintain there's still some benefit to using resolutions > 1080p on a 60" screen for me, at 10 feet.
Caption: An example of a poorly sampled brick pattern, showing aliasing (i.e. Moire pattern) when sampling below the Nyquist limit. See full resolution, non-aliased original.
Source: https://en.wikipedia.org/wiki/Aliasing
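That aliasing effect is easy to demonstrate numerically. Here's a minimal numpy sketch, a 1-D analogue of the brick-pattern Moire above: a pattern above the sampling grid's Nyquist limit comes out of sampling identical to a much lower-frequency one.

```python
import numpy as np

# A 9-cycle-per-unit pattern sampled on a 10-sample-per-unit grid
# (Nyquist limit: 5 cycles per unit) aliases down to 1 cycle per unit.
true_freq = 9.0       # cycles per unit, above the Nyquist limit
sample_rate = 10.0    # samples per unit
t = np.arange(0, 2, 1 / sample_rate)

sampled_high = np.cos(2 * np.pi * true_freq * t)
alias = np.cos(2 * np.pi * (sample_rate - true_freq) * t)  # 1 cycle per unit

# The two sampled signals are numerically indistinguishable: fine detail the
# screen can't represent shows up as a coarse, very visible pattern instead.
print(np.allclose(sampled_high, alias))  # True
```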
Even so, I draw the limit at 4k. I don't foresee myself going to 8k at any distance. The only argument I could see for > 4k is in a truly wrap-around screen, like Sphere, where you need high-res everywhere, since you're not confining the field of view to normal screen proportions. In normal viewing contexts, 4k gives the content creators plenty of headroom to pre-filter high frequencies, without the image appearing overly soft.
That said, I get why content creators might want to film at 8k, because you need to start with a higher res source, prior to anti-aliasing. Furthermore, 8k provides some extra margin for cropping or zooming.
The article said:
"I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable."
It's an easy experiment to try, even if you don't have access to 50" monitors of either resolution. Let's say your 1440p monitor is 27" and your 4k monitor is 32". Put the first at a distance of 5.4 feet (or 5 feet and 4.8 inches) and put the second display at 6.4 feet (or 6 feet and 4.8 inches). The conversions are trivial to compute, because scale changes as a linear function of distance. Both monitors should now fill the same area of your field of view.
After adjusting both monitors to have the same approximate brightness, take an 8k image and scale it down to each monitor's native resolution, using a high-quality filter, like lanczos*. Then, make it full-screen and see if you can discern any details on the 4k monitor you can't see on the 1440p one.
* To counter the aliasing artifacts I mentioned above, I'd filter it a bit more aggressively, but this gets you in the ballpark of what each monitor can optimally display.
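If anyone wants to script that downscaling step, here's a rough Pillow sketch; the file names are placeholders, and it assumes Pillow 9.1 or newer for the Resampling enum:

```python
from PIL import Image  # pip install Pillow

# Hypothetical source: any 8K (7680x4320) test image will do.
src = Image.open("test_chart_8k.png")

targets = {
    "chart_3840x2160.png": (3840, 2160),  # for the 4K monitor
    "chart_2560x1440.png": (2560, 1440),  # for the 1440p monitor
}

for out_name, size in targets.items():
    # Lanczos resampling low-pass filters while downscaling, which keeps
    # aliasing artifacts out of the smaller images being compared.
    src.resize(size, resample=Image.Resampling.LANCZOS).save(out_name)
```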
ingtar33
This was well known... 10 ft probably depends on the size of the screen, but there were graphs showing the human eye's ability to discern various resolutions on various screens at various distances a decade-plus ago.
I think for computer screens it went something like this, from 2.5-3 ft from your face:
24"-27": 1080p (max limit of the eye to discern, so higher resolutions on 24" screens won't be visible to the eye)
27"-36": 1440p
36"+: 4K
Note: if you're further back, the eye's ability to tell the difference decreases, so for example, if you were 6 ft away you probably couldn't tell the difference between 1440p and 1080p on a 32" screen.
(Note: all numbers are 'remembered' from 10+ years ago. My memory may be incorrect, but I think on the whole this is right.)
bit_user
ingtar33 said: "I think for computer screens it went something like this, from 2.5-3 ft from your face: 24"-27": 1080p (max limit of the eye to discern, so higher resolutions on 24" screens won't be visible to the eye); 27"-36": 1440p; 36"+: 4K."
I disagree with this. At work, I spent many years looking at a 24" monitor that was 1920x1200 resolution. I would sit with my face between 24" and 30" away from it, and I could quite easily see the individual pixels.
Likewise, with 2560x1440 at 27", I can easily see individual pixels.
Where the DPI starts to stretch my limits is 4k at 32". This is almost too much resolution for my eyes, at that size. However, a larger screen would either need to sit farther away (hence defeating the point) or would require me to move my head too much and do a lot of re-focusing, both of which are fatiguing. A curved screen would help with the re-focusing part, but the head movement would probably still be too much for me.
That said, I'm old-school, in that I like to actually use every pixel. So, I set my editor windows to use fonts with the fewest pixels that don't impact legibility, even on lower-DPI displays. Yeah, I could just use larger fonts on a 4k monitor, but that would partly defeat the point for me.
redgarl
It's looking like some of these scientists need a glasses prescription. If you can't see the difference, then you are obviously blind... especially with screens averaging 77 inches.
usertests
50-inch would be small for an 8K TV. I assume most (in the small % of homes that have them) are between 65 and 100 inches.
The resolution exists at smaller sizes, but it will (or should) be closer to your face.
Where's my 9.6-inch 8K tablet?! Or 8.3-inch...
bit_user
usertests said: "50-inch would be small for an 8K TV. I assume most (in the small % of homes that have them) are between 65 and 100 inches."
True, the size did seem odd, if we're really talking about TVs.
Of course, the key thing isn't just the size in the abstract, but how much of your field of vision it occupies.
usertests said: "Where's my 9.6-inch 8K tablet?! Or 8.3-inch..."
I'm a lot more interested in 120 Hz content than I am in 8k.
JarredWaltonGPU
Given that a study was conducted, what I'd really like to know is how the same people from the study would do in blind testing at a distance of 10 feet, identifying which of two displays looked 'better'. This should be relatively simple to do.
Hide the borders of the displays and have people sit on a couch ten feet away. Have them pick whether A or B looks better, or if they look the same. Rinse, lather, repeat with 10 pairs of displays or whatever and collect the data. Or have them rank ten displays from best to worst.
But that's not what the study did. It used a high-end 27-inch Eizo ColorEdge CS2740 4K monitor that was moved toward or away from the participants along a 1.4m track. Which means what, exactly? Are we talking about the human eye differentiating between a 27-inch 1440p and 8K (or even 4K) display at ten feet? If so, that's dumb. What I want to know is what it means for a typical 65-inch TV, not a 27-inch monitor.
And I get that these things start to become subjective, but that's precisely what I'm interested in seeing tested. Here we have the science saying one thing, that human eyes can't tell the difference between 1440p and 8K at 10 feet. Fine. Now go get a bunch of displays, all of them 65-inches, and line them up in a dark room with users sitting 10 feet back (and a barrier so they can't get closer... or maybe have the seat move up and down the line). Have them rank the displays, randomize the order, etc.
Oh, and do this for way more than 18 people. That's a trivially small number of participants. I'd like to see 100 people doing a test to determine how they rate various 1080p, 1440p, 4K, and 8K displays from ten feet away. I suspect the 4K and 8K displays will end up ranking higher, even if science claims our eyes can't see the difference.
(* Caveat: Obviously, the panels used are going to matter, and getting equivalent quality 1080p, 1440p, 4K, and 8K displays is difficult / impossible.)
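For what it's worth, the randomization side of a test like that is trivial to script. A rough sketch, where the display list and trial count are placeholders rather than anything from the study:

```python
import itertools
import random

displays = ["1080p", "1440p", "4K", "8K"]  # hypothetical panel pool

def build_trials(repeats=3, seed=0):
    """Every unordered pair of displays, shown `repeats` times,
    with left/right placement and overall order randomized."""
    rng = random.Random(seed)
    trials = []
    for a, b in itertools.combinations(displays, 2):
        for _ in range(repeats):
            pair = [a, b]
            rng.shuffle(pair)   # counterbalance which side each panel sits on
            trials.append(tuple(pair))
    rng.shuffle(trials)         # randomize trial order across the session
    return trials

for i, (left, right) in enumerate(build_trials(), 1):
    print(f"Trial {i:2d}: left={left:5s} right={right:5s} (answer: left / right / same)")
```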
This all reminds me of the "science says the human eye can't see more than 18 FPS" or whatever the current claims are. There's a whole bunch of science mumbo-jumbo done to conclude that we really don't need high FPS content, but if you sit me in front of a game running at 30 FPS, 60 FPS, and 120 FPS where I'm playing the game, I am confident I can pick those three out. Now, beyond 120 FPS? Yeah, I'd struggle, but the super low FPS claims (less than 20) only hold water when dealing with analog film capture for a movie, and even then not fully.