sybersitizen
Participation in this is voluntary. If you do not like polls and/or do not want to participate in this one, ignore it.
If you do want to participate ...
- You'll need to have a 4K display device
- You'll need to know its diagonal dimension
- You'll need to display a 4K JPEG on it
- You'll need to look at the JPEG from different measured viewing distances and evaluate what you can see
This thread is not a platform for argument. If you post something to try to turn it into an argument, I'll report your post for removal.
So ...
I've seen plenty of threads where people talk about how well they can see the resolution difference between 4K and 2K. Talk is cheap. It would be nice to have a tangible reference and factual reporting on this for a change.
I built a test image, available below. This test is strictly about resolution difference and viewing distance for still images. It's not about color gamut or frame rates or refresh rates or compression or anything else.
The upper left quadrant is a straight 1920x1080 crop from a scene I photographed over the summer. The other three quadrants are that same crop, but downsampled in Photoshop to 960x540, then upsampled back to 1920x1080. I used three different resampling methods: Bicubic (best for smooth gradients), Bicubic Sharper/Smoother (Sharper for the reduction, Smoother for the enlargement), and Nearest Neighbor (preserves hard edges).
Testing different resampling algorithms is important because many media devices do their own interpolation while others might not. For now, we can confine the test to comparing the best and worst quadrants - Nearest Neighbor being the worst by far.
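For anyone who wants to reproduce the construction outside Photoshop, here's a rough sketch in Python with Pillow. The filenames and the placement of the two middle quadrants are my assumptions, and Pillow has no exact equivalent of Photoshop's Bicubic Sharper/Smoother, so LANCZOS stands in for that quadrant.

```python
# Rough sketch of the quadrant construction (Python + Pillow).
# Assumptions: filenames are hypothetical; LANCZOS stands in for
# Photoshop's Bicubic Sharper/Smoother, which Pillow doesn't offer;
# placement of the two middle quadrants is a guess.
from PIL import Image

def round_trip(img, filt):
    """Downsample 1920x1080 -> 960x540, then back up, with one filter."""
    return img.resize((960, 540), filt).resize((1920, 1080), filt)

src = Image.open("crop_1920x1080.jpg")  # the straight 1920x1080 crop

quads = [
    src,                                         # upper left: untouched
    round_trip(src, Image.Resampling.BICUBIC),   # upper right
    round_trip(src, Image.Resampling.LANCZOS),   # lower left (stand-in)
    round_trip(src, Image.Resampling.NEAREST),   # lower right: worst case
]

# Tile the four 1920x1080 quadrants into one 3840x2160 (4K) test image.
canvas = Image.new("RGB", (3840, 2160))
for i, q in enumerate(quads):
    canvas.paste(q, ((i % 2) * 1920, (i // 2) * 1080))
canvas.save("4k_test.jpg", quality=95)
```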
When the combined image is displayed on a 4K monitor or TV, only the upper left quadrant is getting the benefit of 4K. The resolution of the other quadrants has been reduced to the equivalent of HD (2K) content.
When I look at this on my own 40" 4K TV, I have to be within just a few feet of it in order to see any difference between the good upper left quadrant and the horrible lower right quadrant. The other two quadrants are in between, and I would have to be even closer to see their differences.
I have average vision (with glasses), but my daughter has better than 20/15 vision. I asked her to do the same evaluation of this image (she is a Photoshop-savvy photographer and technology worker, so she knows how to look at displayed images critically). Her observation was the same as mine. She has to be within a few feet of the screen to see any difference. In our normal viewing positions of 10 feet from the screen, it's absolutely impossible for her to see a difference.
In our cases, the viewing distance must be approximately 1.5x the diagonal measurement of the display (40" x 1.5 = 60" = 5 ft) or closer in order to see the difference between the best and worst quadrants. At 2x the diagonal measurement (80", or about 6.7 ft) or farther, we can no longer see the difference.
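To put those rules of thumb in code (my extrapolation, assuming the 1.5x/2x thresholds scale linearly with screen size):

```python
# Hypothetical helper: converts the 1.5x / 2x diagonal rules of thumb
# into viewing distances in feet, assuming linear scaling with size.
def thresholds(diagonal_inches):
    visible = 1.5 * diagonal_inches / 12   # difference visible at/inside (ft)
    gone = 2.0 * diagonal_inches / 12      # difference gone at/beyond (ft)
    return visible, gone

for size in (40, 55, 65, 75):
    v, g = thresholds(size)
    print(f'{size}": visible within ~{v:.1f} ft, gone beyond ~{g:.1f} ft')
# 40": visible within ~5.0 ft, gone beyond ~6.7 ft (matches the numbers above)
```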
Now the poll question: At what distances can other readers here see the resolution difference in this test image with their 4K displays of various sizes?
(I marked my answer #4: 1.5x the diagonal measurement or closer.)
[Attached image: 4K Test (attachment 2139815)]
