Z6 4K 60p

Tavarino

Is it pure wishful thinking or does anyone with expertise in recording codecs and camera image processors think that the upcoming support for CFexpress cards might make 4K 60p feasible on the Z6? I guess I’m hoping that increased write speeds might add some benefits for video.
 
The lack of 4K60 probably has nothing to do with the card type. It's likely due to the sensor readout modes or the camera's processing bandwidth, before anything is even written to a card. 4K60 also doesn't by itself tell you much about the card speed required--that's determined by the bitrate.

For comparison, the Fuji XT3 does 4K60 but uses SD (UHS-II) cards, which are slower than the XQD cards already found in the Z6.
 
4K 60p is not in the cards, both literally and figuratively :)

The Z6's full-sensor readout is just fast enough to support full-sensor 4K sampling at 30 fps. Theoretically it could do 60 fps at 4K, but that would require line skipping, which would make the detail loss at 4K too great to be useful.
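To put rough numbers on that (the ~24.3MP resolution and 12-bit readout below are assumptions, purely to illustrate the scale of the data involved):

```python
# Back-of-the-envelope sensor readout bandwidth.
# Assumptions (illustrative, not confirmed specs): ~24.3 MP full-frame
# sensor at 6048 x 4024, 12-bit video readout.
PIXELS = 6048 * 4024          # approximate Z6-class pixel count
BITS_PER_PIXEL = 12           # assumed video readout bit depth

def readout_gbps(fps):
    """Raw off-sensor data rate in gigabits per second for a full-sensor readout."""
    return PIXELS * BITS_PER_PIXEL * fps / 1e9

print(f"30 fps: {readout_gbps(30):.1f} Gbit/s")
print(f"60 fps: {readout_gbps(60):.1f} Gbit/s")
```

Doubling the frame rate doubles the raw readout bandwidth, and that's all before any scaling, compression, or card writes happen downstream.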
 
The HD 1080p60 footage recorded on the Z6 is good enough to upres to 4K in post and cut in with 4K footage as slow motion. Yes, 4K/UHD 60fps would be nice, but the files would also be at least twice the size.
Cheers
 
Do you know if the 1080p from the Z6 uses line skipping/binning, and at which framerates?
 
I just bought a Samsung Galaxy Note 9. It does 4K 60p. I really wish the camera companies would move to the forefront of imaging technology, their sole purpose, rather than trailing behind.
 
Do you know if the 1080p from the Z6 uses line skipping/binning, and at which framerates?
I've observed a definite change in viewfinder lag between 50 fps and 60 fps, and a noticeable change in IQ at 120 fps. I assume 120 fps is doing some form of line skipping, and that 60 fps uses different processing, though with negligible IQ reduction.

My rule of thumb based on my observations: at 1080, I'll shoot 60 fps; at 4K, 24 fps (if I'm doing 24 fps, I don't see any benefit in going down to 1080); and slow motion at 120 fps.
 
I just bought a Samsung Galaxy Note 9. It does 4K 60p. I really wish the camera companies would move to the forefront of imaging technology, their sole purpose, rather than trailing behind.
Your phone has a significantly smaller sensor (and lower resolution). Camera companies can do 4K60 full frame, but not for $2k currently, without significant effects, such as poor rolling shutter. Give it a bit of time. Perhaps Panasonic will wow us with this. We'll see.
 
Good point about the lower resolution sensor. I'm not quite understanding how the size of the sensor would make a difference, given the signals move at the speed of light.
 
It's just a general, practical issue. I'm not in the sensor fabrication industry, so I can't comment too specifically, but this is a real challenge. At worst, it may just be related to the physical size of the pixels on each sensor wafer, since sensors aren't produced individually (they're "cookie cut" from a larger wafer). It could just boil down to costs.
 
The HD 1080p60 footage recorded on the Z6, is good enough to upres to 4K in post to cut in with 4K footage as slowmo.
Good call. In fact, Z6 HD 1080/120p upconverted to 4K is pretty impressive. For a falling-objects scene, I went with a shutter speed of 1/250 and used the Red Giant Shooter Instant 4K plugin to upconvert.

Only downside found so far is that 120p is not output over HDMI to my field monitor (60p is). But I can work around this.
 
Pretty intense.
 
FWIW, some quick stats for 28sec Z6 clip:

H.264 (High, L 5.1), 1920 x 1080, 16:9, 119.88p, 124.52 MBit/s

HD File size: 444MB

Time to copy XQD to SSD: <3 secs

Time to upconvert from HD to 4K: 27 secs

4K file size: 2.4GB
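As a quick sanity check, bitrate times duration roughly predicts the reported HD file size (the small gap would be container and audio overhead):

```python
# Sanity-check the clip stats: bitrate * duration should roughly match file size.
bitrate_mbit_s = 124.52   # reported video bitrate, Mbit/s
duration_s = 28           # reported clip length, seconds

predicted_mb = bitrate_mbit_s * duration_s / 8   # megabytes (video stream only)
print(f"predicted: {predicted_mb:.0f} MB vs reported: 444 MB")
```

That lands within a couple percent of the reported 444 MB, which is about what you'd expect once audio and MOV container overhead are added.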
 
Yes, but it should be noted that this is after processing (including hardware processing).

Before all of that, the sensor still needs to read out the pixels and sample/scale them down to that resolution upstream.
 
Also, the faster the frame rate at a given resolution like UHD, the more heat the sensor generates. That heat needs to be dissipated to keep background noise down: the hotter the sensor runs, the more noise it generates.

I believe the Z6 does not line skip for UHD or HD video up to 60fps; after that, it appears to be line skipping, as the quality takes a dive.
Cheers
 
In the less formal tests I've done, this seems correct. There seem to be three distinct read/record modes:
  1. 120 FPS: it seems to line skip. Probably 100 FPS as well, but I haven't checked.
  2. 60 FPS: it seems to have full quality, and the least amount of lag. If I'm shooting 1080p, this is the mode I would use.
  3. 50 FPS and below: it seems to have full quality, but significant lag. This includes 4K. If I'm shooting 4K, I'll use 24 FPS.
#3 is counterintuitive for me--I would think that a slower frame rate with the same image quality would result in the least lag. So perhaps there is a difference in quality that is practically imperceptible.

It should also be noted that video bitrates and codecs are "per second"--meaning that if there are twice as many frames per second, each high-motion frame could potentially have half the quality. Key word: potentially. There are so many variables that this is almost never the case.

In any event, 4K @ 60 FPS would be 4x more data to crunch than 1080/60, and twice as much data as 1080/120.
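Those 4x and 2x figures fall straight out of the pixel counts:

```python
# Relative raw pixel throughput (pixels per second) of each recording mode.
def pixels_per_sec(w, h, fps):
    return w * h * fps

uhd60 = pixels_per_sec(3840, 2160, 60)
hd60  = pixels_per_sec(1920, 1080, 60)
hd120 = pixels_per_sec(1920, 1080, 120)

print(uhd60 / hd60)    # UHD has 4x the pixels of 1080 at the same frame rate
print(uhd60 / hd120)   # doubling the 1080 frame rate halves that gap
```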
 
What do you mean by lag? Rolling shutter?
 
No, not rolling shutter, but rather latency.

There is a delay in what's happening in real life vs. when it is shown on the rear LCD / EVF.

It's pretty poor at 24, 25, 30, 50 FPS. My measurements:
  • In 4K24, the lag is around 0.26 seconds
  • In 1080P120, the lag is around 0.13 seconds
  • In 1080P60, the lag is around 0.06 seconds
  • In 1080P50, the lag is around 0.2 seconds
  • In 1080P24, the lag is around 0.26 seconds
The key is that something special happens at 60 FPS that doesn't happen at 50 or 24 (didn't test 25 & 30).

This 60FPS mode has the least EVF lag / latency; and 24FPS has the same latency whether it's 1080 or 4K. Not sure if this is due to the sensor readout modes (maybe it switches to 12-bit?), intermediate processing, buffering, etc.

By the way, you can test this as well. It's quite a simple test.
  1. Get a stop watch / phone app
  2. Set the camera up in live view, and point it at the phone/stopwatch. Start the video recording.
  3. Using a second camera at a high enough shutter speed (at bare minimum 1/100 second, but 1/1000 is better), take a picture of both the stopwatch and the rear screen. Ideally, try to get the stopwatch and rear screen to align horizontally in the frame when you take the picture.
When you look at the picture, the stopwatch and the rear LCD will display different times. The difference in these times is the lag / latency. Example:

[attached photo: the stopwatch and the camera's rear LCD showing different times]
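The arithmetic on the two readings is trivial (the values below are hypothetical stand-ins, not the ones in the photo):

```python
# Computing display lag from the test photo (hypothetical example readings).
stopwatch_s = 12.47   # time shown on the real stopwatch in the photo
lcd_s       = 12.21   # time shown on the camera's rear LCD in the same photo

lag = stopwatch_s - lcd_s
print(f"lag: {lag:.2f} s")                  # difference between the two readings
print(f"~{lag * 24:.1f} frame periods at 24 fps")
```

Expressing the lag in frame periods at the shooting frame rate gives a feel for how much buffering/processing sits between the sensor and the screen.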
 
The EVF has a native refresh rate of 60fps, so this may be why you are seeing the least lag there. Have you measured the lag in the HDMI output? It would be interesting to know what it is.
Cheers
 
Yes, but a 60fps display should have no issues in displaying 24fps (or 30fps) with little-to-no lag. So I think this lag is related to:
  • sensor readout modes
  • image processing
  • display rendering
Most sensors have preset read modes, and they may speed up significantly by changing bit depth, line skipping, etc.

A small digression, to provide an example:

Here's one example (purely as an example, probably not the Z6's sensor) of a Sony 24MP full-frame sensor's readout modes and the max frame rate per mode: changing to 12-bit full readout (readout mode 1) doubles the frame rate compared to 14-bit; line skipping (readout mode 21) doubles the frame rate again relative to mode 1; cropped 4K also increases readout speed; and so on.
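The scaling can be sketched like this (the base rate is made up; only the ratios matter):

```python
# Illustrative readout-mode scaling. The 10 fps base rate is hypothetical;
# the point is the approximate doubling at each step.
full_14bit_fps = 10.0                      # hypothetical full-readout 14-bit rate
full_12bit_fps = full_14bit_fps * 2        # dropping to 12-bit ~doubles it
skipped_12bit_fps = full_12bit_fps * 2     # line skipping ~doubles it again

print(full_12bit_fps, skipped_12bit_fps)
```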

I believe the Z6's sensor is about twice as fast as this Sony sensor--DPReview claims the Z6's sensor is roughly twice as fast as the A7III's:
At this point, we only have a few data points.

Jim Kasson clocked the Z6 e-shutter at around 1/22 second in 14-bit, and around 1/37 second in 12-bit. My guess is that these could actually be 1/24 sec and 1/36 sec.
(This doesn't necessarily relate to the video readout modes, but this should be the "highest quality" readouts).
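Taking those scan times at face value, here's the frame-rate ceiling they'd imply (a rough sketch that ignores any readout overlap or mode changes):

```python
# Max full-sensor frame rate implied by a full-frame scan time.
# Rough ceiling: assumes the next scan can't start until the previous finishes.
def max_fps(scan_time_s):
    return 1.0 / scan_time_s

fps_14bit = max_fps(1 / 22)   # Kasson's measured 14-bit e-shutter scan time
fps_12bit = max_fps(1 / 37)   # Kasson's measured 12-bit e-shutter scan time

print(f"14-bit ceiling: ~{fps_14bit:.0f} fps")
print(f"12-bit ceiling: ~{fps_12bit:.0f} fps")
```

By this sketch, 12-bit full-sensor readout is fast enough for oversampled 4K30 but nowhere near 4K60, which is consistent with the modes the Z6 actually offers.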

We've also gotten some hints about video readout modes--for example, an interview in which Atomos talks about raw video feeds, including 12-bit raw 4K (or more).

And we also know from press material that the Z6's 4K is oversampled footage.

All in all, I think the lag is primarily due to different read out modes, and some may not be as easy to spot as others. My guess is that Nikon is probably using different bit depths and sampling methods between the modes--probably sacrificing lag for image quality. Bit depths could be 14-bit in 1080/24, and only 12-bit in 4K/24, for example.

Also, Nikon probably changes from oversampling to pixel binning at 1080/60, and then to line-skipping by 1080/120.

So one possible scenario for my tested modes is:
  • 4K/24 = 12-bit oversampled
  • 1080/120 = 12-bit line-skipping
  • 1080/60 = 12-bit pixel binning
  • 1080/24 = 14-bit oversampled
And I would further guess that the slow-motion modes (e.g. 1080/30 x4) enable line skipping, since they have visibly less viewfinder lag but also exhibit visibly more noise.

I don't understand how their 1080/50 works though (but nor do I really care--I don't use this mode). They're probably pixel binning or line skipping, but the increased lag relative to 60 fps suggests they're doing something deeper than the 12-bit pixel binning.

Perhaps someone with more expertise than me can chime in.
 
