Lytro poised to forever change filmmaking: debuts Cinema prototype and short film at NAB
Lytro debuted its Cinema prototype to an eager crowd at NAB 2016 in Las Vegas, NV. It sports the highest resolution video sensor ever made.
Lytro greeted a packed showroom at NAB 2016 in Las Vegas, Nevada to demo its prototype Lytro Cinema camera and platform, as well as debut footage shot on the system. To say we're impressed by what we saw would be an understatement: Lytro may be poised to change the face of cinema forever.
The short film 'Life', containing footage shot on both the Lytro Cinema and an Arri Alexa, demonstrated some of the exciting applications of light field in video. Directed by Academy Award winner Robert Stromberg and shot by VRC Chief Imaging Scientist David Stump, 'Life' showcased the ability of light field to obviate green screens, allowing for extraction of backgrounds or other scene elements based on depth information, and seamless integration of CGI elements into scenes. Lytro calls it 'depth screening', and the effect looked realistic to us.
Just as exciting was the demonstration of a movable virtual camera in post: since the light field contains multiple perspectives, a movie-maker can add camera movement at the editing stage, despite shooting with a static camera. And we're not talking about a simple pan left/right or up/down, or a Ken Burns effect... we're talking about actual perspective shifts. Up, down, left, right, back and forth, even short dolly movements - all simulated by moving a virtual camera in post, not by actually having to move the camera on set. To see the effect, have a look at our interview with Ariel Braunstein of Lytro, where he presents a camera fly-through from a single Lytro Illum shot (3:39 - 4:05):
The Lytro Cinema is capable of capturing these multiple perspectives because of 'sub-aperture imaging'. Head of Light Field Video Jon Karafin explains that in front of the sensor sits a microlens array consisting of millions of small lenses, much like the arrays found on traditional sensors. The difference is that there is a 6x6 array of pixels underneath each microlens, so an image assembled from only the pixels at one position (x,y) under every microlens represents the scene as seen through one portion, or 'sub-aperture', of the lens. There are 36 of these sub-aperture images, each providing a different perspective, which then allows for computational reconstruction of the image with all the benefits of light field.
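Conceptually, pulling sub-aperture views out of such a sensor readout is just strided indexing: take the pixel at the same offset under every microlens. Here's a minimal NumPy sketch of that idea - the 6x6 layout comes from the article, but the function name, array layout and everything else is our illustrative assumption, not Lytro's actual processing:

```python
import numpy as np

def extract_subaperture_images(raw, n=6):
    """Split a raw light field frame into its n*n sub-aperture views.

    `raw` is a (H*n, W*n) array in which each n x n block of pixels sits
    underneath one microlens. The view at angular position (u, v) is formed
    by taking the pixel at offset (u, v) under every microlens.
    """
    views = np.empty((n, n, raw.shape[0] // n, raw.shape[1] // n),
                     dtype=raw.dtype)
    for u in range(n):
        for v in range(n):
            # Strided slice: one pixel per microlens, all at offset (u, v)
            views[u, v] = raw[u::n, v::n]
    return views

# A toy 12x12 'sensor': 2x2 microlenses, 6x6 pixels under each
raw = np.arange(144.0).reshape(12, 12)
views = extract_subaperture_images(raw)
print(views.shape)  # (6, 6, 2, 2) -> 36 views of 2x2 pixels each
```

Each of the 36 resulting images sees the scene from a slightly different position within the lens aperture, which is exactly the parallax information the virtual-camera and refocusing tricks rely on.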
The 36 different perspectives afford some freedom in moving a virtual camera in post, though that freedom is of course limited by considerations like the lens, focal length, and subject distance. It's not clear yet what that range of movement is with the Cinema, but what we saw in the short film was impressive - something cinematographers will undoubtedly welcome in place of setting up motion rigs for small camera movements. Even from a consumer perspective, consider what auto-curation of user-generated content could do with tools like these. Think Animoto on steroids.
We've focused on depth screening and perspective shift, but let's not forget all the other benefits light field brings. The multiple perspectives captured mean you can generate 3D images or video from every shot at any desired parallax disparity (3D filmmakers often have to choose their disparity on-set, only able to optimize for one set of viewing conditions). You can focus your image after the fact, which saves critical focus and focus approach (its cadence) for post.* Selective depth-of-field is also available in post: you can choose whether you want shallow, or extended, depth-of-field, or even transition from selective to extensive depth-of-field in your timeline. You can even isolate shallow or extended depth-of-field to different objects in the scene using focus spread: say F5.6 for a face to get it all in focus, but F0.3 for the rest of the scene.
Speaking of F0.3 (yes, you read that right), light field allows you to simulate, in post, apertures faster (i.e. smaller F-numbers) than previously thought possible, which in turn places fewer demands on lens design. That's what allowed the Illum camera to house a 30-250mm equiv. F2.0 constant aperture lens in a relatively small and lightweight body. You could open that aperture up to F1.0 in post, and at the demo of Cinema at NAB, Lytro impressed its audience with - we kid you not - F0.3 depth-of-field footage. A Lytro representative claimed even faster apertures can be simulated.
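The refocusing and synthetic-aperture effects described above are commonly implemented in light field literature as 'shift-and-add': translate each sub-aperture view in proportion to its angular offset from the aperture center, then average. Averaging all the views mimics the widest aperture; averaging only the central ones mimics stopping down. A minimal NumPy sketch of the technique - this is the textbook approach under our own simplifications (integer-pixel shifts, square aperture mask), not Lytro's actual pipeline:

```python
import numpy as np

def refocus(views, shift_per_view=0.0, aperture=None):
    """Shift-and-add refocusing over sub-aperture views of shape (n, n, H, W).

    `shift_per_view` moves the synthetic focal plane: each view is rolled by
    that amount times its angular offset from center. `aperture` limits how
    many views are averaged - fewer views means a smaller synthetic aperture
    and deeper depth-of-field; all views means the widest aperture.
    """
    n = views.shape[0]
    c = (n - 1) / 2.0  # angular center of the aperture
    acc, count = np.zeros(views.shape[2:]), 0
    for u in range(n):
        for v in range(n):
            if aperture is not None and max(abs(u - c), abs(v - c)) > aperture:
                continue  # outside the synthetic aperture: skip this view
            du = int(round(shift_per_view * (u - c)))
            dv = int(round(shift_per_view * (v - c)))
            acc += np.roll(views[u, v], (du, dv), axis=(0, 1))
            count += 1
    return acc / count
```

With `shift_per_view=0` the focal plane stays where the camera focused; sweeping it through a range of values in post is what lets a timeline transition smoothly from shallow to extended depth-of-field.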
But all this doesn't come without a cost: the Lytro Cinema appears massive, and rightfully so. A 6x6 pixel array underneath each microlens means there are 36 pixels for every 1 pixel on a traditional camera; so to maintain spatial resolution, you need to grow your sensor, and your total number of pixels. Which is exactly what Lytro did - the sensor housing appeared to our eyes to be over a foot in width, sporting a whopping 755 million total pixels. That should mean that at worst, you'd get 755/36, or roughly 21MP final video output. Final output resolution was a concern with previous Lytro cameras: the Illum yielded roughly 5MP equivalent (sometimes worse) stills from a 40MP sensor. However, as we understand it, the theoretical lowest resolution of 21MP with the Cinema sensor means that output resolution shouldn't be a concern for 4K, or even higher-res, video.**
The optics appear as massive as the resolution, but that's partly because there are two optical paths: one for the 755MP light field capture, and another that gives the cinematographer a live preview for framing, focus, and exposure. The insane data rates of light field capture - on the order of terabytes for every few seconds - mean that Lytro Cinema comes with its own server on-set. The sensor is also actively cooled. The whole unit lives on wheeled rails, so forget hand-held footage - for now. Bear in mind, though, that the original Technicolor cinema camera introduced back in 1932 appeared similarly gargantuan, and Lytro specifically mentioned that different versions of Cinema are planned, some smaller in size.
Processing all that data isn't easy - in fact, no mortal laptop or desktop need apply. Lytro is partnering with Google to send footage to the cloud, where thousands of CPUs crunch the data and provide real-time proxies for editing. Lytro stressed the importance of integrating with existing workflows, and to that end is building plug-ins that allow light field video editing within existing editors, starting with Nuke. But Lytro is going a step further: it suggests the light field is the ultimate mastering format, and it's capable of converting all content - from footage to visual effects - into a 4D light field so you can, at any time, go back and re-render your film for any display device. This will be particularly important with the advent of holographic and other innovative light field displays.
The 4K footage from the Lytro Cinema that was mixed with Arri Alexa footage to create the short 'Life' appeared, from our seating position, comparable to what one might expect from professional cinema capture. CEO Jason Rosenthal commented that the short film was shot on both cameras to demonstrate how readily Lytro footage can be intercut with that of other cameras. Importantly, the footage appeared virtually noise-free - which one might expect of such a large sensor area. Furthermore, Jon Karafin pointed out that there are 'hundreds of input samples for every one output sample', which means a significant amount of noise averaging occurs, yielding a clean image and a claimed 16 stops of dynamic range. In fact, in 'Life', noise had to be added back in to get the Lytro footage to match the Alexa.
That's incredibly impressive, given all the advantages light field brings. This may be the start of something truly transformative for the industry. After all, who wouldn't want the option for F0.3 depth-of-field with perfect focus in post, adjustable shutter angle and frame rate, compellingly real 3D imagery when paired with a light field display, and more? With increased capabilities for handling large data bandwidths, larger sensors, and more pixels, we think some form of light field capture may find its way into most cameras of the future. Particularly when it comes to virtual reality capture, which Lytro also intends to disrupt with Immerge.
It's admirable just how far Lytro has come in such a short while, and we can't wait to see what's next. For more information, visit Lytro Cinema.
* If it's anything like the Illum, though, some level of focusing will still be required on set, as there are optimal planes of refocus-ability.
** We're not certain of the actual trade-off for the current Lytro Cinema. It's correlated to the number of pixels underneath each microlens, and effective resolution can vary at different focal planes, or change based on where focus was placed. This may be one reason for the overkill resolution - to ensure that at worst, capture is high resolution enough to meet high demands.