Lytro is bringing its Light Field technology to the world of cinema and visual effects, shortly after its CEO announced in a blog post that the company would be leaving the consumer stills camera space. Lytro Cinema turns every frame of live action into a 3D model, capturing the intensity, color, and angular information of light rays. Coupling light field capture with a 755 MP sensor that shoots at up to 300 fps, Lytro Cinema promises extensive post-production freedom, including adjustment of focus, depth-of-field, shutter speed, and frame rate. It also promises to kill off the green screen, using depth information - instead of chroma keying - to swap out scene elements or easily composite in CGI.

Although Lytro experienced some difficulty in achieving widespread consumer adoption of light field technology in stills, the technology had, and continues to have, immense potential for imaging. Deferring creative decisions to post-processing allows for more creative freedom, and lets a photographer or DP focus on other elements during capture. Nowhere will this be more appreciated than in cinema, where the realities of production mean that any technology that saves certain creative decisions, like focus, for post-capture is most welcome.

Focus and aperture sliders in post-production. In video. No joke. I wish my Raw converter had this (Lytro's Raw converter already does). Photo credit: Lytro

And that's exactly what Lytro Cinema aims to do. And with the likes of Robert Stromberg, showcased in the video, on board, Lytro may already be succeeding (Stromberg is a two-time Oscar winner known for his work on Avatar and Alice in Wonderland). By capturing directional information about light rays in addition to intensity, Lytro Cinema essentially captures multiple focal planes, multiple perspectives, and multiple apertures. This allows for adjustment of focus placement, depth-of-field (via aperture adjustment), perspective, and more in post-processing. And since a depth map is rendered for every frame of video, Lytro claims Cinema will make it easier to combine CGI with live footage, no longer requiring green screens to extract elements or subjects from a scene. You'll be able to just extract a subject based on its depth, which Lytro shows in a convincing example below:
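In general terms, a depth key replaces the color test of a chroma key with a distance test against the per-frame depth map. Here is a minimal numpy sketch of the idea - every name is illustrative, and this is the textbook operation, not Lytro's actual pipeline:

```python
import numpy as np

# Hypothetical depth key: keep pixels whose depth falls inside a range of
# interest, instead of keying on a green-screen color.
# frame: H x W x 3 RGB image; depth: H x W per-pixel distance in meters,
# the kind of depth map Lytro says it renders for every frame.
def depth_key(frame, depth, near, far):
    """Extract pixels whose depth lies in [near, far] meters."""
    mask = (depth >= near) & (depth <= far)       # boolean matte from depth
    matte = mask.astype(frame.dtype)[..., None]   # broadcast over RGB channels
    return frame * matte                          # subject kept, rest black

# Toy example: a 2x2 white frame where only pixels nearer than 10 m survive.
frame = np.ones((2, 2, 3))
depth = np.array([[1.0, 5.0], [12.0, 30.0]])      # meters
keyed = depth_key(frame, depth, near=0.0, far=10.0)
```

The appeal over chroma keying is that the matte is geometric: it doesn't care about spill, wardrobe color, or lighting on the screen behind the subject.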

Since light field cameras effectively sample multiple perspectives from behind the lens, you can output 3D image pairs for any stereo viewing disparity. Perhaps more excitingly, the multiple perspectives mean you can even simulate camera movement as if it were moved on-set. The degree of motion is of course limited, but the technique can be very effective, as demonstrated in this haunting music video shot entirely on the stills-focused Lytro Illum. As Lead Engineer for Light Field Video Brendan Bevensee explains: "You have a virtual camera that can be controlled in post-production." That means there's also nothing stopping one from simulating short dolly motion or perspective shifts in post, with nothing but a static camera at the time of capture. "You can shift the camera to the left... [or] to the right, as if you had made that exact decision on set. It can even move your camera in and out" says Head of Light Field Video, Jon Karafin.
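The standard way to picture this "virtual camera" is the 4D light field L[u, v, y, x], where (u, v) indexes the viewpoint across the aperture and (y, x) the pixel. Shifting the camera in post amounts to picking a different sub-aperture view - a sketch of the textbook model, with names of my own choosing, not Lytro's tools:

```python
import numpy as np

# Sub-aperture view selection from a 4D light field, the simple model behind
# post-capture camera shifts. Names here are illustrative assumptions.
def virtual_view(lightfield, du, dv):
    """Return the view offset (du, dv) from the central viewpoint, i.e.
    re-frame the shot as if the camera had been nudged on set."""
    U, V = lightfield.shape[:2]
    u, v = U // 2 + du, V // 2 + dv
    if not (0 <= u < U and 0 <= v < V):
        raise ValueError("virtual move exceeds the captured baseline")
    return lightfield[u, v]

# Toy light field: 5x5 viewpoints of a 4x4 scene, each view filled with its
# own viewpoint index (10*u + v) so the shift is easy to see.
lf = np.stack([np.full((4, 4), 10 * u + v) for u in range(5) for v in range(5)])
lf = lf.reshape(5, 5, 4, 4)
left = virtual_view(lf, du=0, dv=-2)   # slide the camera left within the baseline
```

The ValueError is the code's version of the caveat above: the virtual move is bounded by the baseline the optics actually sampled at capture time.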

Imagine small, smooth, meditative camera movements that don't even require a complicated motion rig to set up.

Furthermore, by precisely recording X, Y, Z, pitch, roll, and yaw, Lytro Cinema even offers automated camera tracking, which makes it easier to composite and matte CGI elements. And just as the Illum paired with Lytro Desktop software let one select various objects and depths to throw in and out of focus for selective depth-of-field and background blur, one can do the same in video with the Cinema - choosing, for example, to marry live footage from minimum focus out to, say, 10m with different footage, or CGI, for everything beyond that distance. In other words, control over not just single planes, but ranges of planes.
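That range-of-planes composite reduces to a per-pixel source selection driven by the depth map. A minimal sketch, again with assumed names rather than anything from Lytro's software:

```python
import numpy as np

# Hedged sketch of a depth-range composite: keep the live-action frame
# wherever its depth map falls inside [near, far] meters, and pull
# everything beyond that range from a CGI plate.
def composite_by_depth(live, cgi, depth, near=0.0, far=10.0):
    in_range = (depth >= near) & (depth <= far)       # the kept range of planes
    return np.where(in_range[..., None], live, cgi)   # per-pixel source select

# Toy plates: dark live footage, bright CGI background, 2x2 depth map.
live = np.full((2, 2, 3), 0.2)
cgi = np.full((2, 2, 3), 0.9)
depth = np.array([[3.0, 8.0], [15.0, 40.0]])          # meters
out = composite_by_depth(live, cgi, depth)            # live up close, CGI beyond 10 m
```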

Beyond just light field benefits, Lytro is also addressing another common headache: the selection of shutter angle (or shutter speed). Often, this is a decision made at the time of capture, dictating the level of blur or stuttering (à la action scenes in 'Saving Private Ryan' or 'Gladiator') in your footage. At high capture frame rates, though, even the maximum 360° shutter angle yields a very short exposure, removing some of the flexibility over how much motion blur you can or can't have (e.g. 300 fps cannot be shot with shutter speeds longer than 1/300s, which inevitably freezes action). By decoupling the shutter angle of capture from the shutter angle required for artistic effect, a DP can creatively use motion blur, or lack thereof, to suit the story. The technology, which undoubtedly uses some form of interpolation and averaging in conjunction with the temporal oversampling, also means that you can extract stills with a desired level of motion blur.

Lytro claims that by capturing at 300 fps, it can computationally simulate any of a range of shutter angles in post-production, letting a cinematographer decouple the shutter angle required for capture from the one required for artistic intent. Photo credit: Lytro
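To make the arithmetic concrete: at a 24 fps target, each output frame spans 300/24 = 12.5 captured frames, and a 180° shutter means the "exposure" covers half of that window. A sketch of the averaging approach the article speculates about - my assumption of the method, not Lytro's published algorithm:

```python
import numpy as np

# Shutter-angle simulation from temporally oversampled capture. Assumption
# (mine, not Lytro's documented method): averaging consecutive high-rate
# frames approximates a longer exposure, i.e. more motion blur.
def simulate_shutter(frames, capture_fps=300, target_fps=24, shutter_angle=180):
    """Resample high-fps frames to target_fps with a chosen shutter angle.
    shutter_angle=360 keeps the virtual shutter open the full frame period."""
    period = capture_fps / target_fps                 # source frames per output frame
    open_frames = max(1, round(period * shutter_angle / 360))
    out, t = [], 0.0
    while t + open_frames <= len(frames):
        start = int(round(t))
        out.append(frames[start:start + open_frames].mean(axis=0))  # blur by averaging
        t += period
    return np.stack(out)

# Toy clip: one second (300 frames) of a single pixel ramping in brightness.
clip = np.arange(300, dtype=float).reshape(300, 1, 1)
blurred = simulate_shutter(clip, shutter_angle=180)   # 24 fps with 180° blur
```

With a 180° angle, each output frame averages 6 of the 12.5 available source frames; pushing the angle toward 360° widens that window and thickens the blur, all from the same capture.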

If there's one thing you can't accuse Lytro of lacking, it's vision and ambition. Every development by the company has gotten us excited by the implications for both stills and video. The ramifications for the latter, in particular, have always been compelling. The tricky part is in ensuring the benefits of light field don't come at too high a cost in terms of traditional image quality aspects. 755 MP should, hopefully, offset the large resolution cost to sampling directional information, and afford a high enough resolution depth map to avoid artifacts. The claimed 16 EV dynamic range - an attribute important to cinematographers - also sounds promising.

Along with the announcement of the Lytro Immerge 360º virtual reality light field rig, we're extremely excited to see light field video becoming a reality, and look forward to what creatives can produce with what is poised to be an unimaginably powerful filmmaking platform. Filmmakers can sign up for a demonstration and a personalized production package on Lytro's site. For now, Lytro Cinema will be available on a subscription basis, which is understandable given the complexities involved (the immense data capture rates require on-set servers).

Head over to the Lytro Cinema page for more in-depth information. Lytro will be demoing "Life", a short film shot using Lytro Cinema, at NAB 2016.

Lytro Brings Revolutionary Light Field Technology to Film and TV Production with Lytro Cinema

  • World’s First Light Field Solution for Cinema Allows Breakthrough Creative Capabilities and Unparalleled Flexibility on Set and in Post-Production

  • First Short Produced with Academy Award Winners Robert Stromberg, DGA and David Stump, ASC in Association with The Virtual Reality Company (VRC) Will Premiere at NAB on April 19

Lytro unlocks a new level of creative freedom and flexibility for filmmakers with the introduction of Lytro Cinema, the world’s first Light Field solution for film and television. The breakthrough capture system enables the complete virtualization of the live action camera -- transforming creative camera controls from fixed on set decisions to computational post-production processes -- and allows for historically impossible shots.

“We are in the early innings of a generational shift from a legacy 2D video world to a 3D volumetric Light Field world,” said Jason Rosenthal, CEO of Lytro. “Lytro Cinema represents an important step in that evolution. We are excited to help usher in a new era of cinema technology that allows for a broader creative palette than has ever existed before.”

Designed for cutting edge visual effects (VFX), Lytro Cinema represents a complete paradigm shift in the integration of live action footage and computer generated (CG) visual effects. The rich dataset captured by the system produces a Light Field master that can be rendered in any format in post-production and enables a whole range of creative possibilities that have never before existed.

“Lytro Cinema defies traditional physics of on-set capture allowing filmmakers to capture shots that have been impossible up until now,” said Jon Karafin, Head of Light Field Video at Lytro. “Because of the rich data set and depth information, we’re able to virtualize creative camera controls, meaning that decisions that have traditionally been made on set, like focus position and depth of field, can now be made computationally. We’re on the cutting edge of what’s possible in film production.”

With Lytro Cinema, every frame of a live action scene becomes a 3D model: every pixel has color as well as directional and depth properties, bringing the control and creative flexibility of computer generated VFX to real world capture. The system opens up new creative avenues for the integration of live action footage and visual effects with capabilities like Light Field Camera Tracking and Lytro Depth Screen -- the ability to accurately key every object and space in the scene without the need for a green screen.

“Lytro has always been a company thinking about what the future of imaging will be,” said Ted Schilowitz, Futurist at FOX Studios. “There are a lot of companies that have been applying new technologies and finding better ways to create cinematic content, and they are all looking for better ways and better tools to achieve live action highly immersive content. Lytro is focusing on getting a much bigger, better and more sophisticated cinematography-level dataset that can then flow through the VFX pipeline and modernize that world.”

Lytro Cinema represents a step function increase in terms of raw data capture and optical performance:

  • The highest resolution video sensor ever designed, 755 RAW megapixels at up to 300 FPS
  • Up to 16 stops of dynamic range and wide color gamut
  • Integrated high resolution active scanning

By capturing the entire high resolution Light Field, Lytro Cinema is the first system able to produce a Light Field Master. The richest dataset in the history of the medium, the Light Field Master enables creators to render content in multiple formats -- including IMAX®, RealD® and traditional cinema and broadcast at variable frame rates and shutter angles.

Lytro Cinema comprises a camera, server array for storage and processing, which can also be done in the cloud, and software to edit Light Field data. The entire system integrates into existing production and post-production workflows, working in tandem with popular industry standard tools. Watch a video about Lytro Cinema at

“Life” the first short produced with Lytro Cinema in association with The Virtual Reality Company (VRC) will premiere at the National Association of Broadcasters (NAB) conference on Tuesday, April 19 at 4 p.m. PT at the Las Vegas Convention Center in Room S222. “Life” was directed by Academy Award winner Robert Stromberg, Chief Creative Officer at VRC and shot by David Stump, Chief Imaging Scientist at VRC.

Learn more about Lytro Cinema activities during the 2016 NAB Show and get a behind-the-scenes look on the set of “Life” at

Lytro Cinema will be available for production in Q3 2016 to exclusive partners on a subscription basis. For more information on Lytro Cinema, visit