Emmanuel Lubezki: 'Digital gave me something I could never have done on film'
Editor's Note
By Rishi Sanyal
It's not every day you get to chat with a leader in a field making (movie) history who is also a great personal inspiration. Every single frame of The Tree of Life is a photograph to aspire to, every movement of the camera in The New World brimming with intent to evoke a particular feeling. His use of wide-angle lenses close and intimate with subjects (a technique that personally speaks to me) creates immersive experiences. As we began our phone conversation, we were giddy at the thought of getting some insight into Emmanuel 'Chivo' Lubezki's genius. And Emmanuel's down-to-earth approachability and candor meant we got right down to talking shop.
And as much as he'll stress that he's not a technical guru, it must be stated that Emmanuel is as unassuming as he is brilliant: I had the pleasure of meeting him in person a couple of years back at NAB, and within seconds he was asking me what I thought of the newest cameras (I have a feeling he didn't need my advice), before moving on to talk about combining camera and actor motion to simulate a magnitude of movement on-screen that no real human being could be subjected to (when filming Gravity), or how camera motion mimics character motion in order to better connect the viewer with the subject (watch how the camera jumps along with Pocahontas in The New World, or how it bobs up and down as Sandra Bullock spins in zero-G in Gravity).
So we didn't hesitate to get down and technical with Mr. Lubezki. And for our audience that meant, at least in part, understanding the way Emmanuel makes images from a fundamental standpoint. So what camera does Emmanuel shoot with for much of his personal stills work? Back in 2014, it was the Nikon D800. And now? The Nikon D810. If there's a trend you've noticed, there's good reason for it. Emmanuel is a master of light, and recreating the dramatic light of fleeting, powerful moments often means capturing as much of that light as possible in the first place, so it can be shaped into the final product. Allow us to elaborate.
If there's one thing clear from Emmanuel's body of work, it's that he'll go to any length to get the perfect shot. No detail is too small to matter, not even the shape or color of a sunburst peeking through the leaves, if only ephemerally (one might even say it's the fleeting moments that are most memorable). That quest for perfection puts immense demand on the production, and video is not a medium where you can simply take another shot every time you don't get it just right. Especially not when shooting film, where it's often hard to know whether you got it right at all until viewing dailies. The high cost of not getting it right means that many pro videographers sweat the technical details, and justifiably so. By recording as much as possible at the time of capture, certain creative decisions can be saved for post, allowing the artist to focus on the things not as easily changed or manipulated during capture.
In the past, I wouldn't have dreamt of capturing this shot in a single exposure. Now, with high dynamic range sensors, I can, but by altering my exposure philosophy. Instead of exposing for my main subject, I expose for highlights, tonemapping darker underexposed tones for dim, low dynamic range displays in post-processing.
Photo: Rishi Sanyal
That's why Emmanuel appears so interested in topics like VR, light field, and dynamic range - in general, rich capture mediums that allow for maximum flexibility post-capture.* More specifically, when it comes to dynamic range, if tones are over- or underexposed, you've often lost them, and the more you capture, the more latitude you have after the fact for creative intent. 'If there's one thing engineers can improve in cameras, it's dynamic range, dynamic range, dynamic range,' to paraphrase Mr. Lubezki at the Technical Summit, NAB 2014. In order to capture the incredible vistas and subject detail in the scenes he's wont to shoot, typically during the 'magic hour', Emmanuel must record the detail in the bright skies, sunset-lit clouds, and warm sun flares, as well as render the dark faces of naturally-lit subjects visible. Those subjects may have exposures many, many stops darker than the bright skies captured. Capturing both extremes requires a medium with extensive dynamic range.
In the days of film, you could typically set your exposure for your subject's face, and not worry (too much) about the sky behind your subject blowing to white. That's because of the highlight roll-off negative film displayed: above a certain threshold, film became less sensitive to light the more you exposed it, allowing you to overexpose to give most tones** a higher signal:noise ratio. Not so with digital, which tends to display a linear response to light: double the exposure and you get twice the signal, up to a certain point above which color channels clip and you're left with detail-less white. That's why it's so important to adopt a different exposure philosophy for digital, and it was fascinating to hear this stated in Emmanuel's own words: essentially, expose digital for the highlights you wish to retain, because if you overexpose them, they may be lost forever.
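To make the contrast concrete, here's a toy numerical sketch of the two response types (our illustration, not any real film stock's or sensor's measured curve): a digital sensor stays linear until it clips hard at saturation, while negative film's shoulder compresses highlights instead of discarding them.

```python
import math

def digital_response(exposure, full_well=1.0):
    """Linear up to saturation, then a hard clip: detail above is gone."""
    return min(exposure, full_well)

def film_response(exposure, shoulder=0.5):
    """Toy shoulder curve: linear at low exposure, logarithmic roll-off above."""
    if exposure <= shoulder:
        return exposure
    # Above the shoulder, each doubling of exposure adds ever less density.
    return shoulder + 0.5 * shoulder * math.log1p((exposure - shoulder) / shoulder)

# Step a mid-grey exposure (0.25) up in whole stops:
for ev_over in range(4):
    x = 0.25 * (2 ** ev_over)
    print(f"+{ev_over} EV  digital={digital_response(x):.2f}  film={film_response(x):.2f}")
```

By +2 and +3 EV the digital value pins at full well (clipped white), while the toy film curve keeps separating tones - the behavior that let film shooters expose for the face without fearing the sky.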
But exposing for the highlights in high contrast scenes often means that darker tones are, traditionally speaking, 'underexposed'. Advances in digital capture technology mean, though, that these underexposed tones can still be fairly usable if you brighten them (tonemapping) so that they're visible on our current low dynamic range, low brightness displays. What's more, these underexposed tones tend to have higher signal:noise ratios than similar tones recorded on film (see DXO's research on why digital may have already surpassed film, when using high thresholds for acceptable noise levels) - meaning these darker tones can still be relatively noise-free. They'll still be limited by shot noise, as all capture mediums are, so more exposure will always yield better results (more light always means less noise), particularly for shadows that start off with less light to begin with. But low levels of electronic read noise, coupled with the high pixel saturation capacities of, for example, the Nikon D810 at ISO 64, or larger-sensor cameras like the ARRI Alexa 65 Chivo used, mean extensive light capture ability and low noise for all tones, especially tonemapped shadows. These shadows can be so devoid of noise that landscape photography masters like Marc Adamus advocate foregoing graduated neutral density filters in favor of exposing-for-the-highlights (or 'exposing to the right', ETTR) and tonemapping when possible.*** We've previously visually demonstrated the advantages of the D810 at ISO 64 over similar cameras when exposed in this manner, and present those results again here (note the increased detail, and lower noise, in the D810 shadows in our widget below, where all shots were exposed properly for the highlights):
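The signal:noise argument can be sketched numerically. This is a hedged, illustrative model - the electron counts and read noise below are assumed round numbers, not measurements of the D810 or any camera - using the standard relation that SNR from shot noise plus read noise is signal / sqrt(signal + read_noise²):

```python
import math

def snr(photons, read_noise_e=3.0):
    """Signal-to-noise ratio with Poisson shot noise plus Gaussian read noise."""
    return photons / math.sqrt(photons + read_noise_e ** 2)

highlight = 40000  # electrons near pixel saturation at base ISO (assumed figure)
for stops_down in (0, 4, 8):
    signal = highlight / (2 ** stops_down)
    print(f"-{stops_down} EV: {signal:8.0f} e-  SNR ~ {snr(signal):6.1f}")
```

Even 8 stops below the highlights, SNR stays in double digits when read noise is this low - which is why tonemapped shadows from an expose-for-the-highlights capture can look surprisingly clean.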
This gets to the heart of much of our discussion with Emmanuel: DPs (Directors of Photography) have learned quickly that when it comes to digital, exposing for the highlights and brightening, or tonemapping, shadows in post-processing is the way to work with high contrast scenes in this capture medium. This is still a methodology arguably not well-appreciated in the stills sphere, and it was fascinating to hear from Mr. Lubezki himself that it's been quickly adopted in the video world. But even this methodology is limited: (1) shadows are still inherently noisy due to plain physics, and (2) our ability to expose optimally is still limited by the fact that we can't always see our tones accurately during the capture phase. Meaning that even if we want to optimize our exposure by giving the camera as much light as possible, we often can't tell exactly when tones are irrevocably clipped to white (due to the lack of Raw zebras/histograms on many cameras), or lost in murky, noisy shadows (because we can't see the shadows in their final, brightened or tonemapped, form on a high-resolution, HDR output device during capture).
That's why Emmanuel builds his own look-up tables (LUTs) and installs them on his camera - to get a 'proxy' of how he might process the final footage, and to assess during capture whether or not his highlights and shadows are acceptable. This is similar, in a sense, to the flat gamma profiles that come standard on many cameras these days, which attempt, at least in part, to give you a sense of the dynamic range available for you to utilize in grading. DPs like Emmanuel Lubezki, of course, take this a step further and customize their own profiles to be more representative of how they might grade the footage (one might re-introduce some blacks, for example, to avoid the very flat look of log gamma profiles), to get a sense of how usable their footage is during capture. And you thought we were technical with our talk of 'ISO-invariance'...
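For stills shooters unfamiliar with LUTs, the idea can be sketched in a few lines. This is a purely illustrative toy curve (not Lubezki's actual LUT, whose contents weren't shared): a 1D table mapping linear scene values through a flat, log-style curve, with a small black point re-introduced so shadows don't look completely washed out.

```python
import math

def build_lut(size=256, log_strength=6.0, black_point=0.02):
    """Toy 1D LUT: log-style flat curve with a slight black crush re-applied."""
    lut = []
    for i in range(size):
        x = i / (size - 1)                                              # linear input, 0..1
        flat = math.log1p(log_strength * x) / math.log1p(log_strength)  # flat/log gamma
        graded = max(0.0, (flat - black_point) / (1.0 - black_point))   # re-introduce blacks
        lut.append(graded)
    return lut

lut = build_lut()
print(f"shadow in={8/255:.3f} out={lut[8]:.3f}   highlight in=1.000 out={lut[255]:.3f}")
```

Shadow inputs come out well above their linear values (so their usability is visible on the monitor during capture), while the table still reaches a true black and a true white.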
The flip-side of this discussion is the output. HDR, or high dynamic range, is often understood in photography as the merging of exposures to overcome the limitations of current cameras in capturing high contrast scenes. This process can often lead to flat pictures, but it's important to understand that this is a limitation of our current low brightness, low dynamic range displays. When you pack all that tonal range into a limited output range, biased toward a much darker total output than what we're used to seeing in the real world, you get either dark shadows, or a flat looking image from raising those shadows nearer to the brightness levels of brighter tones. HDR displays change all that: brighter whites and darker blacks mean these displays are capable of recreating a range of tones closer to what we're used to seeing in the real world. But that means a whole new workflow: on such devices, shadows don't need as much brightening in shots exposed-for-the-highlights, because all tones are already shifted to the right by virtue of simply being displayed brighter.
Dark shadows you 'push' today (in tools like Photomatix or Photoshop) may need to be 'pulled' (darkened), or pushed less, on a brighter, higher dynamic range display. Cinematographers like Emmanuel are entirely familiar with the concept of editing in an output-aware manner, which is why different grades of The Revenant were created for a normal TV, an HDR TV, cinema, etc. And this stresses the need for standards. We'd like to imagine a world where display attributes like brightness and dynamic range are properly profiled, just like color gamut already is, so that all grading can be done in a display-aware manner. Perhaps brightness and contrast edits done by a content creator on a profiled display could automatically re-scale for the dynamic range and brightness of the viewing device, taking into account human perception. Whether or not this is feasible is another matter, and grades will likely always benefit from being done on the intended output display device, to optimize for its dynamic range and color gamut - just as today prints benefit from editing on a dim display (~90 nits) that better simulates the illumination of the print. With the advent of new technologies enabling drastically brighter, higher contrast, and wider color gamut displays, though, the need for some sort of standardization that gives content creators confidence that what they edit is what viewers see will become increasingly important.****
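Here's a back-of-the-envelope sketch of that push/pull idea, under a deliberately naive linear display model (all numbers are our assumptions for illustration, not any grading standard): a deep shadow that needs a hefty push to be visible on a ~100-nit SDR display may need no push at all, or even a slight pull, on a 1000-nit HDR display.

```python
import math

def push_needed(target_nits, scene_relative, display_peak_nits):
    """EV of brightening needed so a scene-relative tone displays at target_nits."""
    displayed = scene_relative * display_peak_nits  # naive linear display model
    return math.log2(target_nits / displayed)

shadow = 0.02   # a deep shadow at 2% of peak signal
target = 16.0   # the absolute brightness (nits) we'd like that shadow shown at
for peak in (100, 1000):
    print(f"{peak:4d}-nit display: push {push_needed(target, shadow, peak):+.1f} EV")
```

On the 100-nit display the shadow needs a +3.0 EV push; on the 1000-nit display the same tone already lands above the target and would be pulled slightly - the core of why grades must be redone (or intelligently rescaled) per display class.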
Many often talk about technical details as if they were separate and distinct from artistic vision. But what our conversation with Emmanuel has served to solidify in my mind is that the two serve and enable one another. Lubezki talks about the often subconscious appreciation of beauty a photograph or movie evokes, and what is often unappreciated is that this is the result of very intentional decisions, many of them both artistic and technical in nature. My hope is that advances in technology and standardization, and an open conversation, will unleash more creative freedom for visionary artists like Emmanuel.
A hearty thanks again to Emmanuel Lubezki!
* A large part of what computational photography is focused on.
** Beyond a certain point, overexposed tones in film would start to look noisy, since brighter tones are recorded as denser film, which means more film grain.
*** Remember though that shot noise will make underexposed tones noisier than brightly exposed tones, so you'll always be better off merging two different exposures, or using a grad ND filter to increase the foreground exposure. However, sensor advancements are increasing dynamic range to the point that cameras like the Nikon D810 yield underexposed shadows of single shots with noise levels roughly equivalent to full-frame ISO 1000 or 2000 after 4 or 5 EV pushes from base ISO, respectively. Certainly not unacceptable, for some.
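The footnote's equivalence is simple arithmetic: pushing an exposure N stops in post is, in terms of light collected, roughly like having shot at base ISO × 2^N (a rule of thumb that ignores read-noise differences between ISO settings).

```python
base_iso = 64  # e.g., the Nikon D810's base ISO
for push_ev in (4, 5):
    equivalent = base_iso * 2 ** push_ev
    print(f"ISO {base_iso} pushed {push_ev} EV  ~  ISO {equivalent}")
```

That yields ISO 1024 and ISO 2048 - the 'roughly ISO 1000 or 2000' figures quoted above.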
**** To read more about challenges and efforts in standardization in this arena, visit the website of our friends over at SpectraCal.