Two lenses, one sensor, for perfect alignment and color/exposure in each view. 8K DCI RAW video capability. Canon's VR Utility software (free, with a limitation) enables processing of 8K clips, including RAW clips. Many in-camera focusing and framing aids.
So, I tried it out. Here are two test videos in 3D VR180, 4K per eye. If you do not view them with a VR headset, you will see them in 2D, but you can still look around with your mouse or by moving your phone, so you get the VR180 part. With a headset, you see the video in 3D and can look around.
In this 4K video the clips were shot in 8K DCI All-I HEVC 10-bit Clog3 and transformed with a Canon LUT in the VR Utility program:
In this 4K video the clips were shot in 12-bit 8K DCI RAW:
The three-step workflow I used: first, each clip was processed (including trimming and debayering) in the Canon VR Utility, which rendered 8K 3D VR180 H.264 8-bit clips (there are other output choices). Those clips were then merged losslessly. Finally, Google's free VR metadata injector was used to insert (losslessly) the proper 3D VR180 metadata into the final video so YouTube can process the upload appropriately. A sketch of the last two steps is below.
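For the merge and metadata steps, here is a minimal Python sketch, assuming ffmpeg (concat demuxer with stream copy) handles the lossless merge and Google's Spatial Media Metadata Injector (the spatialmedia tool) handles the metadata injection; the file names are placeholders and the exact injector options may vary with the tool version, so treat this as an illustration rather than the exact commands I ran:

# Sketch of steps 2 and 3: lossless merge, then VR metadata injection.
# Assumes ffmpeg and Google's spatialmedia tool are installed; file names are placeholders.
import subprocess

clips = ["clip1.mp4", "clip2.mp4", "clip3.mp4"]  # VR Utility output clips, in playback order

# Step 2: lossless merge via ffmpeg's concat demuxer (stream copy, no re-encode)
with open("clips.txt", "w") as f:
    for c in clips:
        f.write(f"file '{c}'\n")
subprocess.run(["ffmpeg", "-f", "concat", "-safe", "0",
                "-i", "clips.txt", "-c", "copy", "merged.mp4"], check=True)

# Step 3: inject left-right stereo spherical metadata (lossless container rewrite)
subprocess.run(["python", "spatialmedia", "-i", "--stereo=left-right",
                "merged.mp4", "final_vr180.mp4"], check=True)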