Lightgreen (Guest)
That's how it worked in the former 4K implementation, literally with MJPEG being a motion JPEG codec, where both were done "on-DIGIC".

Think of a computer that has two powerful programs on it. Independently they can run smoothly. Run them together and the computer crashes. If you want to run them together, your only option is to buy a more expensive computer with a bigger processor.

I noted it's MP4 (the h264 codec).
Good chance DPR is right and future models that sport 4K will be both MP4 (not MJPEG) and, I don't see why not, DPAF-capable. Withholding DPAF in 4K would appear to be a clear neutering of the video, which makes some sense: DPAF is a very Canon tech, and a coveted one at that for video. Keeping 4K and DPAF away from each other on an entry-level model is logical. It also shows it's intentional, not a technical deficiency. Interesting.
When recording video there are two main programs running. One of them is the autofocus, and the second captures the images from the sensor and records them to the SD card. This is how DPAF works (from Canon's web site):
It is; that many frames per second in just JPEG, forget h264, is messy.

Each pixel on the CMOS imaging sensor has two separate, light-sensitive photodiodes, which convert light into an electronic signal. Independently, each half of a pixel detects light through separate micro lenses, atop each pixel. During AF detection, the two halves of each pixel -- the two photodiodes -- send separate signals, which are analyzed for focus information. Then, an instant later when an actual image or video frame is recorded, the two separate signals from each pixel are combined into one single one, for image capturing purposes. This greatly improves AF speed over the majority of the area on which you're focusing. The result is phase-detection autofocus, which surveys the scene and recognizes not only whether a subject is in focus or not, but in which direction (near or far), and by how much.
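The left/right comparison Canon describes can be sketched in a few lines. This is a toy illustration (my own sketch, not Canon's actual algorithm): the two half-images from the photodiodes are compared at various shifts to find defocus direction and magnitude, then summed to form the captured pixel values.

```python
# Toy dual-pixel sketch: each pixel has a left and right photodiode.
# AF step: find the shift that best aligns the two half-signals.
# Capture step: sum the halves into one pixel value.

def best_shift(left, right, max_shift=3):
    """Return the shift (in pixels) that best aligns the half-signals.
    Sign gives focus direction, magnitude gives how far off focus is."""
    def sad(a, b):  # sum of absolute differences
        return sum(abs(x - y) for x, y in zip(a, b))
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            scores[s] = sad(left[s:], right[:len(right) - s])
        else:
            scores[s] = sad(left[:s], right[-s:])
    return min(scores, key=scores.get)

# A defocused edge: the right photodiodes see the same edge shifted by 2 px.
left  = [0, 0, 0, 10, 10, 10, 0, 0, 0, 0]
right = [0, 10, 10, 10, 0, 0, 0, 0, 0, 0]

shift = best_shift(left, right)               # phase-detect: direction + amount
image = [l + r for l, r in zip(left, right)]  # capture: combine the two halves
print(shift)  # → 2
```

The point of the sketch is that the same photodiode reads serve both jobs: compared separately they give phase-detect AF, summed they give the image.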
That sounds to me like a program that is CPU-intensive.
What's obviously happened, like our cellphone-maker counterparts, is that with the advent of h264 4K in the M50 (at 25fps, at that), that's too much heat to brute-force on DIGIC; they've obviously adopted a licensed h264 encoder chip like everyone else and bitten the bullet. Now heat is only an issue on sensor readout, not CPU-side, and limiting to a 1.6x crop does help, as that's a lot of data you're still pulling off the sensor at 25fps. Cropping puts you on a similar playing field to a smaller sensor that's doing 4K, like, say, a Sony RX100.

When recording 4K video the camera's processor has to capture and process twice the amount of data as 1080p. That sounds to me like another CPU-intensive program.
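For scale, some quick arithmetic on those data rates (my own back-of-the-envelope UHD numbers, not Canon's published specs). Note that UHD 4K is actually four times the pixels of 1080p per frame, i.e. twice per linear dimension:

```python
# Pixel throughput, UHD 4K vs 1080p, both at 25 fps (illustrative arithmetic).
uhd = 3840 * 2160            # 8,294,400 px per frame
fhd = 1920 * 1080            # 2,073,600 px per frame
print(uhd / fhd)             # → 4.0 (4x the pixels per frame, 2x per dimension)
print(uhd * 25 / 1e6)        # → 207.36 (megapixels/second into the encoder)
```

Roughly 200 megapixels per second have to be read and compressed before anything hits the SD card, which is why a dedicated h264 block matters so much more at 4K than at 1080p.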
Combine these programs together and you need a bigger and faster processor. To expect any camera manufacturer to provide 4K with good AF for under $1000 is totally unrealistic.

People point to Sony. Personally I do not trust Sony; they really push the limits of their cameras' processors. Both the Sony a6000 and a6300 had overheating problems, and the a6500 was released six months later to solve the a6300's problems, with an increase in price. The Sony A9 also has overheating problems. There is a reason battery life sucked on Sony cameras (the A73 and A9 appear to be better): extra power is needed to cool the processors. Imaging Resource did a rain-simulation test in which the A7R3 failed badly; Sony does not properly weather-seal the bottom of their cameras. Weather sealing keeps water out of the camera, but it also keeps heat in. They cut corners to pack in a lot of features.

Everybody keeps talking about what great video cameras Panasonic makes. But guess what: AF in 4K sucks as well. Yet it is better in 1080p. Why?
DPAF works in 1080p, and in stills; this is a DPAF sensor, still.
Firing just one side and not the other isn't buying you anything here.
The way DPAF works is that the right and left cells fire concurrently during capture, so it's a 24MP readout, but really twice as many photodiodes exist than are necessary to produce that 24MP readout, hence the 5DSR has that pixel-shift option on RAWs.
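A quick sanity check on that readout claim (illustrative arithmetic, my numbers, not a Canon spec sheet):

```python
# A "24MP" dual-pixel sensor has two photodiodes under every microlens,
# so a full readout touches twice as many analog values as output pixels.
output_pixels = 24e6
photodiode_reads = 2 * output_pixels
print(photodiode_reads / 1e6)   # → 48.0 (million photodiode reads per frame)
```

That doubling is also why keeping the two half-signals around (rather than summing them on readout) roughly doubles the amount of raw data per frame.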
Mind you, the 1DX II and 5DIV can already do DPAF + 4K on a much older DIGIC that is already brute-forcing and doing DPAF calculations at the same time. I don't buy it.
I know what you're saying, but you're smart enough to know what I'm saying too, I gather.
You're saying heat is still an issue; I'm saying that with h264 support they've moved away from brute force, which is a 10x drop in CPU "cost". Now the sensor is a whole different ball game, hence the crop makes some sense.