The short answer is: no and yes. Some analysts would have you believe that “8K TV blows 4K away,” which might suggest that you at least want a 4K TV. The reality, when it comes to electronics and perception, is more complicated.
One might assume that higher resolution always makes a picture better, because the pixels get smaller and smaller, to the point where you don’t see them anymore. But the human visual system — your eyes — has a finite capacity, and once you exceed this, any other “improvement” is wasted, because it just won’t be seen.
Here’s why (warning, geometry involved):
The term “20/20 vision” is defined as the ability to just distinguish features that subtend one arc-minute of angle (one-sixtieth of a degree). In other words, two objects at a given distance can only be resolved as separate objects if they are far enough apart.
Using trigonometry, this works out to about 1/32″ as the smallest separation a person with 20/20 vision can see at a distance of ten feet. We can use the same math to show that the “optimum” distance from which to observe an HD (1080-line) display (i.e., where a 20/20 observer can just resolve the pixels) is about three times the picture height.
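For readers who want to check the geometry themselves, here is a minimal Python sketch of the arc-minute calculation (the function name is mine and the numbers are rounded, but the math is just the trigonometry described above):

```python
import math

ARC_MINUTE = math.radians(1 / 60)  # one arc-minute, in radians

# Smallest separation that subtends one arc-minute at 10 feet (120 inches):
separation_in = 120 * math.tan(ARC_MINUTE)
print(f"Resolvable separation at 10 ft: {separation_in:.3f} in")  # ~0.035 in, on the order of 1/32 in

# "Optimum" viewing distance for an n-line display: the distance at which
# one pixel row (picture height / n) subtends exactly one arc-minute.
def optimum_distance_in_picture_heights(lines: int) -> float:
    return 1 / (lines * math.tan(ARC_MINUTE))

print(f"1080 lines: {optimum_distance_in_picture_heights(1080):.1f} picture heights")  # ~3.2
```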
On a 1080-line monitor with a 15″ diagonal, this works out to an optimum viewing distance of just under two feet; with a 42″ display, it’s about five-and-a-half feet. Sitting closer than this means the pixels become visible; sitting farther means that the resolution is “wasted.” Keep in mind, also, that most people sit about 9 feet away from the TV, a figure sometimes called the “Lechner distance,” after a well-known TV systems researcher.
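Carrying the same math through to specific screen sizes gives a quick back-of-the-envelope check (this sketch assumes 16:9 displays, where the picture height is the diagonal times 9/√(16²+9²); the helper name is mine):

```python
import math

ARC_MINUTE = math.radians(1 / 60)
HEIGHT_PER_DIAGONAL = 9 / math.hypot(16, 9)  # ~0.49 for a 16:9 panel

def optimum_viewing_distance_in(diagonal_in: float, lines: int = 1080) -> float:
    """Distance (inches) at which one pixel row just subtends one arc-minute."""
    picture_height = diagonal_in * HEIGHT_PER_DIAGONAL
    return picture_height / (lines * math.tan(ARC_MINUTE))

for diag in (15, 42):
    print(f"{diag}-inch 1080-line display: {optimum_viewing_distance_in(diag) / 12:.1f} ft")
# Prints roughly 2.0 ft and 5.5 ft, matching the figures above.
```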
Of course, these numbers (and others produced by various respectable organizations) are based on subjective evaluation of the human visual system, and different observers will show different results, especially when the target applications vary. Nonetheless, the “three picture heights” rule has survived critical scrutiny for several decades, and we haven’t seen a significant deviation in practice.
At 4K, the optimum distance becomes 1.6 picture heights: at the same 5.5-foot viewing distance as the 42″ HD display above, one needs an 84″-diagonal display (7 feet), which is already available. For these reasons, some broadcasters believe that 4K is not a practical viewing format, since viewers would have to sit at about 1.6 picture heights, far closer than typical home viewing, for the extra resolution to match normal human visual acuity.
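As a sanity check on those 4K numbers, the same geometry can be turned around to ask how big a 16:9 screen must be before its pixels are just resolvable at a given distance (again a rough sketch; the helper function is hypothetical):

```python
import math

ARC_MINUTE = math.radians(1 / 60)
HEIGHT_PER_DIAGONAL = 9 / math.hypot(16, 9)

def required_diagonal_in(distance_in: float, lines: int) -> float:
    """Smallest 16:9 diagonal (inches) whose pixels are just resolvable at this distance."""
    picture_height = distance_in * lines * math.tan(ARC_MINUTE)
    return picture_height / HEIGHT_PER_DIAGONAL

print(f"2160 lines: {1 / (2160 * math.tan(ARC_MINUTE)):.1f} picture heights")  # ~1.6
print(f"4K diagonal needed at 5.5 ft: {required_diagonal_in(66, 2160):.0f} in")  # ~85 in, close to the 84-inch figure above
```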
At 8K, the numbers become absurd for the typical viewer: 0.7 picture heights, or a 195″ diagonal (16 feet) at a 5.5-foot distance. With a smaller display, or at a larger distance, the increased resolution is completely invisible to the viewer: that means wasted pixels (and money). Granted, the roughly 105-degree viewing angle such a display would subtend at that distance approaches a truly immersive and lifelike experience, but such a screen would be very large (and thus very expensive); how many people would put such a beast in their home?
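The ~105-degree figure can be checked the same way; the sketch below computes the full horizontal angle subtended by a 16:9 screen viewed on-axis (the function name is mine):

```python
import math

WIDTH_PER_DIAGONAL = 16 / math.hypot(16, 9)  # ~0.87 for a 16:9 panel

def horizontal_viewing_angle_deg(diagonal_in: float, distance_in: float) -> float:
    """Full horizontal angle subtended by a 16:9 screen, viewed on-axis at its center."""
    half_width = diagonal_in * WIDTH_PER_DIAGONAL / 2
    return 2 * math.degrees(math.atan(half_width / distance_in))

print(f"195-inch screen at 5.5 ft: {horizontal_viewing_angle_deg(195, 66):.0f} degrees")
# Prints about 104 degrees, i.e. the roughly 105-degree angle cited above.
```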
From a production perspective, 4K does make some sense: an environment that captures all content in 4K and then processes it in a 1080p workflow for eventual distribution will produce archived material of very high intrinsic quality. Of course, there’s a cost associated with that, too.
But there are two other reasons why you might be persuaded to upgrade your HDTV: HDR (High Dynamic Range) and HFR (High Frame Rate). Briefly, HDR increases the dynamic range of video from about 6 stops (64:1) to more than 200,000:1 (about 17.6 stops), making detail and contrast appear closer to those of reality. HFR increases the frame rate from the currently typical 24, 30 or 60 fps to 120 fps. These features make a much more noticeable improvement in pictures, at almost any level of eyesight. But that’s another story.
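The stops-versus-contrast arithmetic is easy to verify, since each stop is a doubling of light level (a trivial sketch, nothing more):

```python
def stops_to_contrast(stops: float) -> float:
    """Contrast ratio corresponding to a dynamic range expressed in stops (each stop doubles the light level)."""
    return 2 ** stops

print(f"6 stops    -> {stops_to_contrast(6):,.0f}:1")    # 64:1
print(f"17.6 stops -> {stops_to_contrast(17.6):,.0f}:1") # ~198,700:1, i.e. roughly the 200,000:1 figure above
```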
— agc