The EVF has a native refresh rate of 60 fps, which may be why you are seeing the least lag there. Have you measured the lag on the HDMI output? It would be interesting to know what it is.
Cheers
Yes, but a 60fps display should have no issue displaying 24fps (or 30fps) with little to no lag. So I think this lag is related to (see the sketch after this list):
- sensor readout modes
- image processing
- display rendering
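To make that breakdown concrete, here's a rough latency-budget sketch in Python. The function name and all numbers are my own illustrative assumptions, not measured Z6 values:

```python
# Rough glass-to-EVF lag budget, purely illustrative.
# A 60 Hz panel adds at most ~16.7 ms of scan-out delay, so on its
# own it can't explain large lag differences between video modes.

def evf_lag_ms(readout_ms, processing_ms, display_hz=60.0):
    """Worst-case lag: sensor readout + image pipeline + one refresh."""
    return readout_ms + processing_ms + 1000.0 / display_hz

# Hypothetical 1/24 s full readout with 10 ms of processing:
print(evf_lag_ms(readout_ms=1000 / 24, processing_ms=10))  # ~68.3 ms
```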
Most sensors have preset readout modes, and readout can speed up significantly by changing bit depth, enabling line skipping, etc.
A small digression, set apart to provide an example:
Here are the readout modes and maximum frame rates per mode of a Sony 24MP full-frame sensor (purely an example, and probably not the Z6's sensor):
So, again as an example, switching to the 12-bit full readout mode (readout mode 1) doubles the frame rate compared to 14-bit. Line skipping (readout mode 21) doubles that frame rate again relative to no skipping (readout mode 1). Cropped 4K also increases readout speed. Etc.
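As a rough sketch of why those modes scale that way: model readout time as rows read times a per-row conversion time. The row times below are invented for illustration (real sensors switch ADC configurations between modes, so the scaling isn't exactly linear in bit depth):

```python
# Back-of-envelope readout model; all constants are assumptions.
ROW_TIME_US = {14: 15.0, 12: 7.5}  # assumed per-row ADC time (µs)

def max_fps(total_rows=4000, bit_depth=12, skip=1):
    rows_read = total_rows // skip  # line skipping reads fewer rows
    readout_s = rows_read * ROW_TIME_US[bit_depth] * 1e-6
    return 1.0 / readout_s

print(max_fps(bit_depth=14))          # ~16.7 fps: 14-bit full readout
print(max_fps(bit_depth=12))          # ~33.3 fps: 12-bit doubles it
print(max_fps(bit_depth=12, skip=2))  # ~66.7 fps: 2x skipping doubles again
```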
I believe the Z6 sensor is about twice as fast as this Sony sensor; DPReview claims it reads out roughly twice as fast as the A7III's sensor:
At this point, we only have a few data points.
Jim Kasson clocked the Z6's e-shutter at around 1/22 s in 14-bit, and at around 1/37 s in 12-bit. My guess is that these could actually be 1/24 s and 1/36 s.
(This doesn't necessarily relate to the video readout modes, but these should be the "highest quality" readouts.)
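A quick sanity check on those numbers; my guessed values would make the 12-bit readout exactly 1.5x faster, which is suspiciously tidy:

```python
# Ratio of 14-bit to 12-bit e-shutter readout times, measured vs. guessed.
measured = (1 / 22) / (1 / 37)  # ~1.68x faster in 12-bit
guessed = (1 / 24) / (1 / 36)   # exactly 1.5x faster
print(round(measured, 2), round(guessed, 2))  # 1.68 1.5
```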
We've also gotten some hints about the video readout modes. For example, in this interview Atomos talks about raw video feeds, including 12-bit 4K raw (or more):
And we also know from press material that the Z6's 4K is oversampled footage.
All in all, I think the lag is primarily due to the different readout modes, and some may not be as easy to spot as others. My guess is that Nikon is using different bit depths and sampling methods between the modes, trading extra lag for image quality. Bit depths could be 14-bit in 1080/24 and only 12-bit in 4K/24, for example.
Also, Nikon probably changes from oversampling to pixel binning at 1080/60, and then to line skipping by 1080/120.
So one possible scenario for my tested modes (made concrete in the toy model after this list) is:
- 4K/24 = 12-bit oversampled
- 1080/120 = 12-bit line skipping
- 1080/60 = 12-bit pixel binning
- 1080/24 = 14-bit oversampled
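Purely to make that scenario concrete, here's the toy model in code. Every readout time below is an assumption picked to be consistent with my guesses above, not a measurement:

```python
# Guessed per-mode readout times; invented numbers, for ordering only.
MODES = {
    "4K/24":    ("12-bit oversampled",   1000 / 36),
    "1080/120": ("12-bit line skipping", 1000 / 144),
    "1080/60":  ("12-bit pixel binning", 1000 / 72),
    "1080/24":  ("14-bit oversampled",   1000 / 24),
}

# Sort by assumed readout time: less readout time ~ less EVF lag.
for mode, (desc, ms) in sorted(MODES.items(), key=lambda kv: kv[1][1]):
    print(f"{mode:9} {desc:21} ~{ms:.1f} ms")
```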
And I would further guess that the slow-motion modes (e.g. 1080/30 x4) enable line skipping, since they show visibly less viewfinder lag but also visibly more noise.
I don't understand how their 1080/50 works, though (not that I really care; I don't use this mode). They're probably pixel binning or line skipping, but the increased lag relative to 60 fps suggests they're doing something deeper than 12-bit pixel binning.
Perhaps someone with more expertise than me can chime in.