Summary
- A study by the University of Cambridge and Meta suggests that the difference in sharpness between 4K and 8K is imperceptible to the human eye under typical viewing conditions.
- The experiment involved 18 participants and showed that average visual acuity exceeds the 20/20 standard, yet without practical gains at very high resolutions.
- The researchers also created a free online calculator to estimate the limit of visual perception, reinforcing that 8K rarely offers real benefits.
Is the investment in an 8K TV really worth it? A recent study by scientists at the University of Cambridge, in the United Kingdom, suggests not. The research indicates that, at a typical living-room viewing distance, 4K and 8K screens offer no noticeable benefit in sharpness compared to a similarly sized 2K screen.
The study was carried out in collaboration with Meta, published in the journal Nature Communications, and sought to determine the real limit of what the human eye can distinguish. Although normal (20/20) vision is usually taken to imply the ability to resolve 60 pixels per degree (PPD), the researchers found that most people can see detail beyond that.
It is important to note, however, that the research involved only 18 participants and focused specifically on the perception of sharpness as a function of resolution.
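As a rough illustration of the PPD metric, the sketch below estimates how many pixels fit into one degree of visual angle for a given screen size and viewing distance. The function name, the 16:9 aspect-ratio assumption, and the example numbers are ours for illustration, not part of the study’s methodology.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=16 / 9):
    """Estimate pixels per degree (PPD) for a flat 16:9 screen viewed head-on."""
    # Screen width from the diagonal and aspect ratio, converted to metres.
    width_m = diagonal_in * 0.0254 * aspect / math.sqrt(1 + aspect ** 2)
    # Visual angle covered by a single pixel at the given viewing distance.
    degrees_per_pixel = math.degrees(math.atan(width_m / horizontal_px / distance_m))
    return 1 / degrees_per_pixel

# 20/20 vision is conventionally equated with resolving about 60 PPD.
# A 27-inch 4K monitor (3840 px wide) seen from 1 m works out to roughly 112 PPD.
print(f"{pixels_per_degree(27, 3840, 1.0):.0f} PPD")
```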
How did the experiment work?


To test this capability, the team used a 27-inch 4K monitor mounted on a mobile stand. The 18 participants, with normal (or corrected) vision, were positioned at different distances from the screen. In each position, two types of image were randomly displayed:
- One with thin vertical lines (in black and white, red and green, or yellow and violet)
- Another completely gray
Participants needed to indicate which image contained the lines. “When they become too thin or the screen resolution too high, the pattern looks no different to a simple gray image,” explains Dr. Maliha Ashraf from the University of Cambridge, author of the study. “We measured the point where people could barely tell them apart. That’s what we call the resolution limit.”
The results showed that the human eye can perceive more detail than the 20/20 standard suggests. The average perceived resolution was 94 PPD for grayscale images viewed from the front. For red and green patterns, the average was 89 PPD, falling to 53 PPD for yellow and violet patterns.
In a second experiment, with 12 participants, white text on a black background (and vice versa) was displayed at different distances. Volunteers indicated when the text looked as sharp as a perfectly focused reference version.
“The resolution at which people stopped noticing differences in the text coincided with what we saw in the line patterns,” says Ashraf.
Calculator estimates how much resolution you can actually perceive


Using the research data, the scientists created a chart and a free online calculator, available on the laboratory’s website. It lets users enter viewing distance, screen size, and screen resolution to see whether their current setup is above or below most people’s visual perception threshold.
The conclusion, according to the scientists, is that many home devices already exceed what the human eye can process. “If someone already has a 44-inch 4K TV and watches it from about 2.5 meters away, this is more detail than the eye can see,” explains Ashraf. “Upgrading to an 8K version of the same size wouldn’t make it any sharper.”
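A quick back-of-the-envelope check of that example, reusing the PPD estimate sketched earlier (the 16:9 panel assumption is ours, and the 94 PPD figure is the grayscale average reported above), comes out on the same side:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=16 / 9):
    """Same PPD estimate as the earlier sketch (flat 16:9 screen, viewed head-on)."""
    width_m = diagonal_in * 0.0254 * aspect / math.sqrt(1 + aspect ** 2)
    return 1 / math.degrees(math.atan(width_m / horizontal_px / distance_m))

AVERAGE_LIMIT_PPD = 94  # average grayscale threshold reported by the study

# A 44-inch TV viewed from 2.5 m: both 4K and 8K land well above the average limit,
# so the extra pixels of 8K would not be resolvable for most viewers.
for label, h_px in (("4K", 3840), ("8K", 7680)):
    ppd = pixels_per_degree(44, h_px, 2.5)
    verdict = "above" if ppd > AVERAGE_LIMIT_PPD else "below"
    print(f"{label}: ~{ppd:.0f} PPD ({verdict} the ~{AVERAGE_LIMIT_PPD} PPD average limit)")
```

Under these assumptions, the 4K panel already sits around 170 PPD at that distance, so doubling the pixel count again changes nothing a viewer could resolve.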
The researcher also suggests that, after a certain point, adding more pixels becomes a “waste”, as the human eye simply cannot detect them.
Source: https://tecnoblog.net/noticias/tvs-4k-ou-8k-nao-fazem-a-menor-diferenca-diz-estudo/
