The Lytro Illum is aimed at changing photography as a medium. Even if it fails, light field technology holds some exciting possibilities for many areas of imaging.

The original Lytro camera launched more than two years ago. Billed as 'Camera 3.0', it was panned by critics as an underpowered product with limited practical or artistic value. But it introduced photographers to the concept of light field imaging - or at least to its key selling point: the ability to refocus photos after they've been captured.

Since then, smartphones have begun to incorporate post-capture refocusing into their camera apps. Tech writers often describe it as a 'Lytro-like' feature, though in most cases the effect requires no special optical hardware - just a regular mobile camera paired with fast processing that captures a burst of focus-bracketed images and lets you choose among them after the fact.

With no new consumer-grade light field cameras announced since early 2012, and Lytro's calling-card feature absorbed into smartphones as a toss-in, the future of light field camera technology was starting to look uncertain.

But last month the future started to snap back into focus with the announcement of the Lytro Illum. In the weeks leading up to its announcement, we spoke with Jason Rosenthal, CEO of Lytro; Ren Ng, founder and chairman of Lytro; and Kartik Venkataraman, CTO at Pelican Imaging, to get a sense of where this technology is headed in the next few years and beyond.

Lytro’s new hardware certainly packs in a few tricks that will appeal to some still photographers, but its aim is really to introduce a new type of imagery. Time will tell if that medium succeeds, but either way, light field technology has the potential to change the way all cameras are designed because it presents some truly exciting possibilities for lens design, scientific imaging, and video holography.

New and Improved

The Illum is clearly a better camera than the original Lytro Light Field Camera. The specs are much closer to what we expect from today's mid-range cameras: a bigger 1"-type sensor, a much faster Snapdragon 800 processor, cleaner and clearer images that now equal about 5MP in two-dimensional terms, physical buttons and dials, and an overhauled interface, to name a few. The press release mentions that the Illum will integrate with software from Adobe and Apple, a hint that light field photographers will now be able to adjust color, contrast, detail, and other traditional photo parameters.

The real step up, though, is that the Illum has the visual chops to illustrate a crucial point: light field photos, at least in this iteration, are a separate medium from still photos. The depth information adds a layer of interactivity that conventional photos lack - and interactivity is not strictly better, just different. Photographers now have the option to start working in a responsive medium.

The biggest change in the Illum is the larger sensor - now a 1"-type versus the 1/3"-type found in smartphones.

Jason Rosenthal, CEO of Lytro, says part of the aim of the Illum is 'making pictures come alive, or changing the definition of what an image is all about versus the flat, 2D snapshots that we've lived with for the last 175 years'. Now we'll just have to see if artists or viewers have any interest in responsive images.

Here's a quick, rough refresher on how the Illum works as a light field camera: right before light hits the sensor, it passes through a microlens array. That allows the sensor to record directional information about the light, in addition to color and intensity.

The sensor in the Illum has an underlying resolution of 40 million photosites - that is, it collects 40 million points of information. The processor decodes about 5 million of those points as spatial, or flat, resolution, which we measure in megapixels. The rest is left for angular resolution, which we interact with as the refocusing range. We don't really have a great vocabulary for describing depth measurement yet, though Lytro is trying to make the term 'megarays' stick, with mixed results.
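To make the spatial/angular split more concrete, here is a minimal sketch of how a plenoptic capture can be decoded and refocused. The array sizes, the assumption that each microlens covers a square patch of photosites, and the simple shift-and-sum refocusing are illustrative simplifications, not Lytro's actual processing pipeline.

import numpy as np

# A toy plenoptic decode: the sensor sits behind a microlens array, and
# each microlens covers a K x K patch of photosites. A photosite's
# position under its microlens encodes the direction a ray arrived from.
K = 8                    # angular samples per axis (hypothetical)
NY, NX = 500, 800        # microlens grid, i.e. spatial resolution (hypothetical)

raw = np.random.rand(NY * K, NX * K)   # stand-in for a decoded raw frame

# Reshape the flat readout into a 4D light field indexed as (v, u, y, x):
# (y, x) is the microlens a ray landed under (spatial position),
# (v, u) is the photosite within that microlens (ray direction).
lf = raw.reshape(NY, K, NX, K).transpose(1, 3, 0, 2)

def refocus(lf, alpha):
    """Shift-and-sum refocusing: slide each directional view in proportion
    to its angular coordinate, then average. alpha picks the focal plane."""
    K, _, NY, NX = lf.shape
    out = np.zeros((NY, NX))
    for v in range(K):
        for u in range(K):
            dy = int(round(alpha * (v - K // 2)))
            dx = int(round(alpha * (u - K // 2)))
            out += np.roll(lf[v, u], shift=(dy, dx), axis=(0, 1))
    return out / (K * K)

near = refocus(lf, alpha=1.5)    # pull focus toward one plane
far = refocus(lf, alpha=-1.5)    # ...or another, from the same exposure

In this toy model the spatial resolution is simply the microlens count and the angular resolution is K x K; the Illum's own processing makes a more sophisticated trade between the two.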

Illustration from Lytro founder Ren Ng's dissertation on consumer light field cameras, showing how a microlens array can redirect light before it reaches the sensor.

So what does 40 'megarays' of light data look like? Photos are now much more immersive than anything we saw out of the first Lytro. Images can fill the entire height of a computer screen. The refocusing range is also more flexible, able to settle on close to a dozen planes in the frame rather than just the foreground or the background.

This is partially because of the shallower depth of field afforded by the larger sensor, but also because the Illum can capture more depth information. Likewise, the perspective shift effect is more pronounced. Images feel almost holographic.
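That perspective shift falls out of the same 4D data. In the toy decode above, fixing the angular coordinates gives a complete image of the scene as seen from one point on the lens aperture, so stepping between neighbouring views produces the parallax effect (again, a simplification of what Lytro's software actually does):

# Each fixed (v, u) is a sub-aperture view: the scene as seen from one
# point on the lens aperture. Stepping between views shifts the viewpoint.
center_view = lf[K // 2, K // 2]
edge_view = lf[K // 2, K - 1]     # viewpoint moved toward the edge of the aperture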

Lytro's proprietary 'living picture' software is also much smoother now. Instead of requiring clicks to interact, it lets you adjust focus and perspective in real time simply by moving the mouse. When you do click to refocus, the image transitions smoothly from the previous focal point to the new one - almost like moving through the scene.

Not many sample images have been released yet, but the best ones have a few layers of depth. This encourages the viewer to click around, playing up the interactivity and making the most of the format. In the examples from Lytro (see below), poke around the junkyard scene for the names carved into the tree, or the car scene for the creeper in the rearview mirror - they’re barely visible unless you interact with the shots.