
Before this study, the human eye was thought to be capable of resolving around 60 pixels per degree, so the new finding raises the bar for what the eye can perceive at a given distance and screen size. That could prompt display manufacturers to adjust their designs, and indeed, the researchers hope the results could help guide display development, as well as imaging, rendering, and video coding technologies in the future.
Although there's an argument to be made that 8K displays are still somewhat redundant, given how little native content exists for them and how few GPUs are powerful enough to drive them, I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.
Maybe that's a perceived increase in resolution from the added detail such a rendered image might show, rather than actually seeing the pixels? Also, maybe I should just listen to the scientists who've done the kind of testing I'd need to do to prove my point.
heffeque I can definitely see pixels on my 1080p 55" TV at a normal viewing distance, but I'm fairly sure that when I upgrade, it won't be to anything beyond 4K. 4K is already a bit overkill, so anything above 4K (for a normal-sized TV at a normal viewing distance) is going to be nonsense. Obviously 5K-8K monitors do make sense (much shorter viewing distance).
Stomx What BS. At QHD (1440p) resolution, a 50-inch screen, and a 10-foot distance, the PPD is about 124, twice the so-called retina-display pixel density; run a PPD calculator on the web. Of course no one will see any difference.
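Stomx's figure is easy to sanity-check. Below is a minimal Python sketch of the pixels-per-degree arithmetic, assuming a flat 16:9 panel viewed on-axis; the function and the 60 PPD "retina" reference are illustrative, not taken from the study itself.

```python
import math

def pixels_per_degree(diagonal_in: float, horiz_res: int, distance_in: float,
                      aspect: float = 16 / 9) -> float:
    """Approximate angular pixel density (PPD) of a flat screen viewed on-axis."""
    # Horizontal width of the panel, derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    ppi = horiz_res / width_in  # linear pixel density in pixels per inch
    # One degree of visual angle spans roughly distance * tan(1 degree) inches.
    return ppi * distance_in * math.tan(math.radians(1))

# 50-inch 1440p (2560x1440) panel at 10 feet (120 inches), as in the comment above.
print(round(pixels_per_degree(50, 2560, 120)))  # ~123, in line with the ~124 quoted
# The commonly cited "retina" threshold is around 60 PPD.
```

The result lands right around twice the traditional 60 PPD threshold, which is the comparison Stomx is making.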
Zaranthos There are a lot of tests and studies on this, and it gets very complicated and confusing. There are differences between what you see, what you perceive, and how quickly you can detect movement. After digging into the complicated mess of what the human eye can and cannot see, the basic conclusion is that modern display technology is not even close to what the human eye and brain can perceive, despite the technical abilities of the eye itself being pretty limited. It will still be a long time before GPU and display technology can exceed the capabilities of the human eye and brain combination. It may still come down to sitting down and using both and ending with, "I don't know, this one just feels more real," while not being able to "see the difference."
bit_user The article said: we can see that with a 50-inch screen at 10 feet distance, the subjects of the study wouldn't have been able to tell the difference between a 1440p screen and one at 8K resolution.

More than 10 years ago, I already figured out that 10 feet was about the limit of how far back I could discern individual pixels on a 60" 1080p screen, with corrective lenses! However, there's a difference between distinguishing individual pixels and declaring that 1080p is "enough" at that distance. The difference is that screens don't behave like optimal reconstruction filters, so you can still get aliasing artifacts, where higher-frequency signals can manifest partly as lower frequencies which are perceivable at a distance. Therefore, I maintain there's still some benefit to using resolutions > 1080p on a 60" screen for me, at 10 feet.

[Image: an example of a poorly sampled brick pattern, showing aliasing (i.e. a Moiré pattern) when sampling below the Nyquist limit. Source: https://en.wikipedia.org/wiki/Aliasing]

Even so, I draw the limit at 4k. I don't foresee myself going to 8k at any distance. The only argument I could see for > 4k is in a truly wrap-around screen, like Sphere, where you need high-res everywhere, since you're not confining the field of view to normal screen proportions. In normal viewing contexts, 4k gives content creators plenty of headroom to pre-filter high frequencies without the image appearing overly soft. That said, I get why content creators might want to film at 8k, because you need to start with a higher-res source prior to anti-aliasing. Furthermore, 8k provides some extra margin for cropping or zooming.

The article said: I still feel like I can tell the difference between them and 1440p. Maybe not at extreme distances, but at something pedestrian like 10 feet? I feel like that'd be doable.

It's an easy experiment to try, even if you don't have access to 50" monitors of either resolution. Let's say your 1440p monitor is 27" and your 4k monitor is 32". Put the first at a distance of 5.4 feet (or 5 feet and 4.8 inches) and the second at 6.4 feet (or 6 feet and 4.8 inches). The conversions are trivial to compute, because scale changes as a linear function of distance. Both monitors should now fill the same area of your field of view. After adjusting both monitors to approximately the same brightness, take an 8k image and scale it down to each monitor's native resolution using a high-quality filter, like Lanczos*. Then make it full-screen and see if you can discern any details on the 4k monitor that you can't see on the 1440p one.

* To counter the aliasing artifacts I mentioned above, I'd filter it a bit more aggressively, but this gets you in the ballpark of what each monitor can optimally display.
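For anyone tempted to try bit_user's experiment with other panel sizes, both the distance matching and the Lanczos downscale are easy to script. Here's a rough Python sketch using Pillow; the image file names are placeholders, and the 27"/32" figures are simply the examples from the comment.

```python
from PIL import Image  # pip install Pillow

def matched_distance(ref_diag_in: float, ref_dist_ft: float, other_diag_in: float) -> float:
    """Distance at which a second screen subtends the same visual angle as the
    reference screen; apparent scale is (nearly) linear in distance."""
    return ref_dist_ft * other_diag_in / ref_diag_in

# A 27" panel at 5.4 ft and a 32" panel at ~6.4 ft fill the same field of view.
print(round(matched_distance(27, 5.4, 32), 2))  # 6.4

def downscale(src_path: str, dst_path: str, target: tuple[int, int]) -> None:
    """Downscale a high-resolution source to a panel's native resolution
    using a Lanczos filter, as the comment suggests."""
    img = Image.open(src_path)
    img.resize(target, resample=Image.Resampling.LANCZOS).save(dst_path)

# Hypothetical file names; start from an 8k source, per the comment.
downscale("source_8k.png", "for_1440p_panel.png", (2560, 1440))
downscale("source_8k.png", "for_4k_panel.png", (3840, 2160))
```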
ingtar33 This was well known… 10 ft probably depends on the size of the screen, but there were graphs showing the human eye's ability to discern various resolutions on various screens at various distances a decade+ ago. I think for computer screens, at 2.5-3 ft from your face, it went something like this:
24"-27": 1080p (the limit of what the eye can discern, so higher resolutions on 24" screens won't be visible)
27"-36": 1440p
36"+: 4k
Note: if you're further back, the eye's ability to tell the difference decreases, so for example at 6 ft away you probably couldn't tell the difference between 1440p and 1080p on a 32" screen. (Note: all numbers are 'remembered' from 10+ years ago. My memory may be incorrect, but I think on the whole this is right.)
bit_user ingtar33 said: I think for computer screens, at 2.5-3 ft from your face, it went something like this: 24"-27" 1080p (max limit of the eye to discern, so higher resolutions on 24" screens won't be visible to the eye), 27"-36" 1440p, 36"+ 4k.

I disagree with this. At work, I spent many years looking at a 24" monitor that was 1920×1200 resolution. I would sit with my face between 24" and 30" away from it, and I could quite easily see the individual pixels. Likewise, with 2560×1440 at 27", I can easily see individual pixels. Where the DPI starts to stretch my limits is 4k at 32". This is almost too much resolution for my eyes, at that size.

However, a larger screen would either need to sit farther away (hence defeating the point) or would require me to move my head too much and do a lot of re-focusing, both of which are fatiguing. A curved screen would help with the re-focusing part, but the head movement would probably still be too much for me.

That said, I'm old-school, in that I like to actually use every pixel. So, I set my editor windows to use fonts with the fewest pixels that don't impact legibility even on lower-DPI displays. Yeah, I could just use larger fonts on a 4k monitor, but that would partly defeat the point for me.
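As a rough cross-check of the two comments above, the old ~60 pixels-per-degree rule of thumb can be applied to the desktop setups bit_user describes. The 30-inch viewing distance is an assumption taken from the comment, and a higher perceptual limit like the one the study reports would only make these panels easier to tell apart.

```python
import math

def ppd(diag_in: float, h_res: int, v_res: int, dist_in: float) -> float:
    """Pixels per degree for a flat panel viewed on-axis."""
    aspect = h_res / v_res
    width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)
    return (h_res / width_in) * dist_in * math.tan(math.radians(1))

# The desktop setups mentioned above, all at a 30-inch viewing distance.
setups = [
    ("24in 1920x1200", 24, 1920, 1200),
    ("27in 2560x1440", 27, 2560, 1440),
    ("32in 3840x2160", 32, 3840, 2160),
]
for name, diag, w, h in setups:
    print(f"{name}: {ppd(diag, w, h, 30):.0f} PPD")
# Roughly 49, 57, and 72 PPD: only the 32" 4K panel clears the old 60 PPD
# threshold at arm's length, which fits bit_user still seeing pixels on the others.
```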
redgarl It's looking like some of these scientists need a glasses prescription. If you can't see the difference, then you are obviously blind… especially with screens averaging 77 inches.
JarredWaltonGPU Given that there was a study conducted, what I'd really like to know is how the same people from the study would have done in blind testing at a distance of 10 feet, identifying which display looked 'better' given two options. This should be relatively simple to do. Hide the borders of the displays and have people sit on a couch ten feet away. Have them pick whether A or B looks better, or if they look the same. Rinse, lather, repeat with 10 pairs of displays or whatever and collect the data. Or have them rank ten displays from best to worst.

But that's not what the study did. It used a high-end 27-inch Eizo ColorEdge CS2740 4K monitor that was moved toward or away from the participants along a 1.4m track. Which means what, exactly? Are we talking about the human eye differentiating between a 27-inch 1440p and 8K (or even 4K) display at ten feet? If so, that's dumb. What I want to know is what it means for a typical 65-inch TV, not a 27-inch monitor. And I get that these things start to become subjective, but that's precisely what I'm interested in seeing tested.

Here we have the science saying one thing: that human eyes can't tell the difference between 1440p and 8K at 10 feet. Fine. Now go get a bunch of displays, all of them 65 inches, and line them up in a dark room with users sitting 10 feet back (and a barrier so they can't get closer… or maybe have the seat move up and down the line). Have them rank the displays, randomize the order, etc. Oh, and do this for way more than 18 people. That's a trivially small number of participants. I'd like to see 100 doing a test to determine how they rate various 1080p, 1440p, 4K, and 8K displays from ten feet away. I suspect the 4K and 8K will end up ranking higher, even if science claims our eyes can't see the difference. (Caveat: obviously, the panels used are going to matter, and getting equivalent-quality 1080p, 1440p, 4K, and 8K displays is difficult / impossible.)

This all reminds me of the "science says the human eye can't see more than 18 FPS" or whatever the current claims are. There's a whole bunch of science mumbo-jumbo done to conclude that we really don't need high-FPS content, but if you sit me in front of a game running at 30 FPS, 60 FPS, and 120 FPS where I'm playing the game, I am confident I can pick those three out. Now, beyond 120 FPS? Yeah, I'd struggle, but the super-low-FPS claims (less than 20) only hold water when dealing with analog film capture for a movie, and not even fully there.
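The blind A/B protocol Jarred sketches out is simple enough to prototype. Below is a minimal, purely illustrative Python outline of the randomized trial schedule and the tallying; the display labels, repeat count, and responses are hypothetical, not anything from the actual study.

```python
import random
from collections import Counter
from itertools import combinations

def build_schedule(displays: list[str], repeats: int = 3, seed: int = 0) -> list[tuple[str, str]]:
    """Every pairing of displays, repeated and shuffled, with the A/B (left/right)
    assignment randomized so viewers can't key on position."""
    rng = random.Random(seed)
    trials = []
    for _ in range(repeats):
        for a, b in combinations(displays, 2):
            trials.append((a, b) if rng.random() < 0.5 else (b, a))
    rng.shuffle(trials)
    return trials

def tally(responses: list[tuple[str, str, str]]) -> Counter:
    """responses holds (display_a, display_b, choice), where choice is 'A', 'B',
    or 'same'. Returns how many head-to-head picks each display won."""
    wins = Counter()
    for a, b, choice in responses:
        if choice == "A":
            wins[a] += 1
        elif choice == "B":
            wins[b] += 1
    return wins

# Hypothetical 65-inch panels viewed from a fixed 10-foot seat, as proposed above.
schedule = build_schedule(["1080p", "1440p", "4K", "8K"])
print(schedule[:4])  # first few blinded trials for one participant

# Tallying a few made-up responses:
print(tally([("4K", "1080p", "A"), ("1440p", "8K", "B"), ("4K", "8K", "same")]))
```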