You're right; it ain't simple.

No doubt it's a complex issue lacking a simplistic answer. Do you have a recommended alternative value to use for simplification of considerations of pictorial resolution?

As I've said several times before, the half-minute limit is routinely shattered by normal people doing other than character recognition. For an extreme example, there's vernier acuity.

I need to amend my answer a bit. Actually, when an image of 4:3 or 3:2 proportions is viewed from a distance 1.5x the image diagonal, a '6K' monitor is already beyond the limits of human vision.

We can nibble at the edges of what 'normal' human vision can and cannot see, but if you view the whole image at one time, from a distance where you can comfortably see all of it, then the limit of what you can see is somewhere roughly in the vicinity of 24 MP. But it's diminishing returns as you approach that, depending on how good (or how well corrected) your personal vision is, image contrast, etc.

I think it will. The problem with 8K is going to be content. GFX files provide plenty. You sound like the guys who tried to tell me that 6K would not blow 4K out of the water. But it does with my GFX files. I won't argue about the big screen movies, video or TV, which is mostly what you are talking about. I'm sure everything you said about all of that is true. I'm talking about for my photography.

No, it won't.

I don't care about 14 vs 16 bit on my raw files, and I can't tell any difference, so I just shoot 14.
But I can't wait to get my first 8K monitor. That is going to be so awesome....
I wonder if moving from 6K to 8K is going to be as stunning as moving from 4K to 6K was (when viewing GFX files)?
I like HDR. It is amazing, and my Dell 6K monitor does not even have great HDR specs.

You'll see a lot more difference with HDR than with more resolution.
6K blows you away when viewing GFX files. I don't know what you are looking at, but I know what I am looking at. 4K to 6K has been amazing when working with and viewing GFX files.

Until you get to video wall size, resolution beyond 4K isn't that noticeable, but color rendition is.
I want 8K and I want all the HDR I can get.
A so-called 8K display--not really that, presumably 2x UHD--is 7680x4320 pixels. Even if you view whole-screen with no menu bar or whatever, then you're limited to a bit under 25 MP with a 4:3 aspect ratio, or a hair under 28 MP at 3:2. And the so-called 6K Dell display is 6144x3456 pixels, i.e., not quite 16 MP at 4:3 or not quite 18 MP at 3:2. So going from '6K' to '8K' takes you from pretty close to the limits of vision to a bit past it, FWIW. Again, this is for sitting back far enough to comfortably take in the whole image at one time.
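If anyone wants to check that arithmetic, here's a rough sketch in Python; the only assumption is that the image keeps the full screen height and the width shrinks to the stated aspect ratio:

```python
# Back-of-envelope check of the pixel counts quoted above. Assumes the image
# keeps the full screen height and the width is cropped to the stated aspect ratio.

def displayed_megapixels(screen_w, screen_h, aspect_w, aspect_h):
    """Megapixels of an aspect_w:aspect_h image fitted inside the screen."""
    img_w = min(screen_w, screen_h * aspect_w / aspect_h)
    img_h = img_w * aspect_h / aspect_w
    return img_w * img_h / 1e6

for name, w, h in [("'8K' (2x UHD)", 7680, 4320), ("Dell '6K'", 6144, 3456)]:
    print(f"{name}: 4:3 -> {displayed_megapixels(w, h, 4, 3):.1f} MP, "
          f"3:2 -> {displayed_megapixels(w, h, 3, 2):.1f} MP")
# '8K' (2x UHD): 4:3 -> 24.9 MP, 3:2 -> 28.0 MP
# Dell '6K':     4:3 -> 15.9 MP, 3:2 -> 17.9 MP
```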
Of course, if you don't need or want to comfortably take in the whole image at one time, then you can look at some part of it at 100%, i.e., one capture pixel corresponds to one screen pixel. That option exists regardless of whether your screen is a 13" VGA or a 32" '8K'.
My 24 MP figure was for sensor pixels, assuming a 3:2 Bayer sensor with an anti-alias filter. The working assumption there is that effective resolution is only about 70% of the pixel count.
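As a back-of-envelope sketch of how that gets to roughly 24 MP (assuming the 70% applies to linear, per-axis resolution, and borrowing the ~2750-row figure from the viewing-distance math below):

```python
# Rough reconstruction of the ~24 MP figure. Assumption: the ~70% factor applies
# to linear (per-axis) resolution; the 2750-row limit comes from the math below.
visible_rows = 2750                              # rows resolvable at a 1.5x-diagonal viewing distance
linear_efficiency = 0.7                          # effective linear resolution vs. pixel count
sensor_rows = visible_rows / linear_efficiency   # ~3930 sensor rows needed
sensor_cols = sensor_rows * 3 / 2                # 3:2 aspect ratio
print(f"{sensor_cols * sensor_rows / 1e6:.0f} MP")   # ~23 MP, i.e. roughly 24 MP
```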
To elaborate on the analysis / math: if we take the limit of human vision as 0.5 minute-of-angle (i.e., 1/120 of a degree), then at a viewing distance of 1.5x the image diagonal you can discern a vertical image resolution of about 2750 pixels. By that measure, '6K' is already beyond the limit of human vision when viewed from a distance that takes it all in. Most so-called 32" monitors are actually 80 cm (31.5") diagonal; at 16:9 that puts the displayed image height at 15.4", so a 4:3 image is 20.6" wide with a 25.7" diagonal. With a viewing distance of 38.6" to comfortably take in the whole thing, 'normal' vision can only resolve about 178 ppi. That works out to 3667x2750 pixels = about 10 MP of display pixels.
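For anyone who wants to reproduce those numbers, here's the same math as a small sanity-check script; it uses the small-angle approximation, so treat the outputs as approximate:

```python
import math

# Viewing-limit math: 0.5 minute-of-angle acuity, viewing distance of 1.5x the
# image diagonal, 4:3 image shown full-height on a 31.5" (80 cm) 16:9 panel.
acuity_deg = 0.5 / 60                      # 0.5 MOA expressed in degrees
pixels_per_degree = 1 / acuity_deg         # 120 resolvable elements per degree

# 4:3 image geometry: diagonal = (5/3) x height, so viewing distance = 2.5 x height.
view_dist_over_height = 1.5 * 5 / 3
height_angle_deg = math.degrees(1 / view_dist_over_height)   # small-angle approximation
visible_rows = height_angle_deg * pixels_per_degree
print(f"resolvable rows ~ {visible_rows:.0f}")                # ~2750

# Apply it to a 31.5" (80 cm) 16:9 monitor.
monitor_diag_in = 80 / 2.54
screen_height_in = monitor_diag_in * 9 / math.hypot(16, 9)    # ~15.4"
image_width_in = screen_height_in * 4 / 3                     # ~20.6"
image_diag_in = screen_height_in * 5 / 3                      # ~25.7"
viewing_dist_in = 1.5 * image_diag_in                         # ~38.6"
ppi = visible_rows / screen_height_in                         # ~178 ppi
megapixels = visible_rows * (visible_rows * 4 / 3) / 1e6      # ~10 MP
print(f"ppi ~ {ppi:.0f}, visible image ~ {megapixels:.1f} MP")
```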
Obviously all of this involves some simplifications and judgment calls.
The reasons to use a higher-resolution camera of course include wanting to pixel-peep details at 100% and/or make large prints that you can view closely--closing in to see only a small part of the image.
Certainly trying to characterize something as complex as human vision (or hearing or ...) in simplistic terms presents a lot of problems. I don't doubt that many of us can see something past the 0.5 MOA level. IMO how much that contributes to photo viewing is a trickier thing. Having just had my periodic eye exam, the point where I have trouble determining which of 26 well-known characters, presented in quite high contrast, I'm actually seeing strikes me as a decent (if somewhat simplistic) metric for how much pictorial detail I'm likely to see.
For printed materials, my frame of reference is that 300 dpi B&W laser printers created output that, when viewed closely, looked great but not perfect. When we switched over to 600 dpi laser printers, I could see some improvements, e.g. in smaller-size italic characters. Past 600 dpi I haven't seen improvements. Obviously that's for very high contrast detail, albeit probably not printed with perfect precision.
But I have not engaged in any systematic study of visible resolution. I'd be interested to read something of moderate length and depth addressing scientific testing on this issue. Any recommendations?
[ETA]
For those who may be less familiar, to give an additional frame of reference on what 0.5 MOA means, that's about 0.52" at 100 yds or 15mm at 100m.
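A quick way to check that conversion:

```python
import math

# Quick check of the 0.5 MOA frame of reference.
half_moa = math.radians(0.5 / 60)                        # 0.5 minute of angle in radians
print(f"{3600 * math.tan(half_moa):.2f} in at 100 yd")   # ~0.52 in (100 yd = 3600 in)
print(f"{100_000 * math.tan(half_moa):.1f} mm at 100 m") # ~14.5 mm, i.e. about 15 mm
```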
Here's something worth reading:
The Clinical Use of Vernier Acuity: Resolution of the Visual Cortex Is More Than Meets the Eye - PMC
Vernier acuity measures the ability to detect a misalignment or positional offset between visual stimuli, for example between two vertical lines when reading a vernier scale. It is considered a form of visual hyperacuity due to its detectable ...
BTW, I remember a psychologist at an SPIE meeting presenting a paper saying that areal cone density in humans can vary over a 36:1 range (6:1 linearly). That's a pretty broad range. When I was young, my corrected vision was 20/12. Now it's about 20/15 on a good day.