Emmanuel Lubezki: 'Digital gave me something I could never have done on film'
Part II (to read Part I of our interview click here)
DPR: You’ve talked about testing your cameras to understand what results they can deliver. How do you evaluate those results?
EL: I don't know if you've ever been in a color timing suite, like Technicolor, but when you go there it's very easy to really quickly assess how the cameras are behaving and how the sensors are behaving, and what you're getting and what you’re not getting. It's like a massive Photoshop.
DPR: What do you recommend for filmmakers or photographers who don’t have access to a facility like that?
EL: I would recommend that anyone do some little tests. Even if you're a complete amateur it would be great to take a few photos with your cameras and underexpose and overexpose, then go try to print and see what looks better.
DPR: Let’s go back to InVisage and QuantumFilm, which we're also excited about. In your opinion what potential does this technology have to change digital capture? Are you in this because you see it one day being in high-end professional, larger sensor cameras?
EL: The moment I met with InVisage we started talking about what their objectives were, what their idea of what a good image is, what a good capture is, and so on. As I told you before, it's like music to my ears. I'm very, very excited about the possibility of using the technology in very high end digital cameras, where I can do a movie like The Revenant and really be able to capture those 88 keys of the piano as opposed to much less.
DPR: Or maybe even 100 keys!
EL: That's right. Even more! Why not?
I'm also excited at every level because I'm a photographer. I would love to have a miniature camera so I don't have to carry my big Nikon, but a smaller camera that has this technology that can capture high dynamic range. And also on my phone. That would be a dream!
DPR: Another area where we’re seeing a lot of advances is autofocus for video, thanks to technologies like on-sensor phase-detect autofocus. It's getting so good that we've seen cameras that can track faces in video with no hunting, just refocusing very decisively, and that let you control the rate of refocusing as well. Would you ever trust a machine to autofocus?
EL: You know, I would. But unfortunately in film, like in The Revenant, when doing a very long take I have a human doing that job. And it's not only a human, it's an artist, and it's somebody that's very aware of everything - of the internal rhythm of the image and how it affects the audience. What happens is that sometimes those decisions are like in music, like when the drummer is playing slightly behind the beat to make it more jazzy or sensual, or playing slightly ahead of the beat to give the audience a different feeling to the music [syncopation]. It's the same with focus. It would be great to have an assist that would help get the focus exactly where you want it, but I think in cinema it's always going to require a human making decisions.
DPR: That's a very interesting perspective, because the same thing happens with composition and movement and motion in your films. You control the motion of the camera to elicit a feel in the viewer, and your camera is exploring the scene as a human would take in a scene. It slowly looks around, it finds action, it concentrates on it, it finds action somewhere else and concentrates on that. It's very compelling because you're making the viewer feel right there like they're exploring the scene. With that in mind, what are your thoughts on 360 degree image capture and virtual reality (VR)? Where do you see it in the context of professional filmmaking?
EL: I'm very excited about VR. One of the big issues we have in VR right now is that the cameras are very primitive, and one of the biggest issues is actually the dynamic range of those cameras.
Lubezki is a master of capturing a sense of place, as in this shot of Leonardo DiCaprio in The Revenant.
Courtesy Twentieth Century Fox
DPR: With VR the viewer has the freedom to look around on their own. Do you want them to have that freedom?
EL: Not for everything, but the idea is that with different story-telling tools, we will be able to guide the viewer to what we want them to see. The great thing about VR is that the grammar hasn’t been invented yet, though a lot of people are working on VR projects. The questions you're asking are the key questions. How long can you subject the audience to VR? How fast can you move the camera before they get sick? How close can the subjects be? How can you guide the audience through the story? Should we use 360º or 270º or only a few degrees? All that is yet to be written, but generally I think VR and cinema are going to live together. I don't think one is going to take over the other one. It's just another way to tell stories, and I think that's fantastic.
DPR: And with Lytro's 'Light Field Volume' capture, you can capture not only multiple degrees of freedom, but also multiple perspectives and focal planes. Capture all that data and choose how to direct the viewer in post.
EL: And I think that is very exciting too. Light field photography, and even light field projection, is very exciting.* I haven't seen any examples, but the theory is absolutely beautiful.
DPR: One of the reasons we find your work inspiring is your use of wide angle lenses to create these immersive experiences and lend a certain depth to your imagery. How do you control distortion in your images, or how do you use wide angles to craft a look?
EL: I have a bunch of different lenses that I like, and all of them behave differently. All of them have different distortions and different colors and different contrast. I use them depending on the subject and the environment and the light. In The Revenant I ended up using a lot of lenses called Master Primes. These are very sharp, very hard, very clean lenses. And even though our 'normal' lens was the 14mm, it was a lens that distorts very little for what it is. Fifteen years ago nobody would have even thought about using a 14mm lens unless it was for a music video or something reminiscent of a dream. These lenses are now so good, though, that they barely distort.
Obviously, when you put Leo very close to the camera there’s some [perspective] distortion on his face, but that's also something we wanted because we didn't want Leo to look like a beautiful young man. We wanted Leo to look like a trapper from the 1820s. We get all these advantages by using these lenses, and of course the biggest advantage is how immersive they are because you can have your foreground subject and your background environment present at all times. That makes the image very immersive. At least that was the theory!
When we're doing a close-up of Leo the camera is probably just a few inches from his face, but the lens is so wide you can still see his whole face. By being so close you start to capture things like his breath and his sweat, his blood - it becomes a very visceral experience.
DPR: It seemed as if in some of the scenes the distortion added to the anguish, because he was suffering and the image was distorted.
EL: You're absolutely right! When we wanted more distortion I put a diopter in front of the lens. With that diopter you lose a lot of depth of field and the lens is even more distorted. There were five or six times in the movie where I wanted the image to be more distorted, almost to feel as if the camera was feeling Leo's angst and pain.
I don't know if it worked, these are just theories!
Using wide angle Master Prime lenses allowed Lubezki to get close to Leonardo DiCaprio while creating a sense of being immersed in the cold, arctic environment.
Courtesy Twentieth Century Fox
DPR: It worked, because we felt it before you even told us that was your intent! One final area we’d like to ask you about is the need for standards. As an artist, is it sometimes difficult to deal with the fact that you don't know how the end viewer is going to see the image? In a theater? On a cell phone? On a poorly calibrated TV? How do you deal with that?
EL: It's a nightmare. It's always been a nightmare. On film it was a nightmare, but I knew that people were going to watch the movie on film so I tried to watch as many prints as possible before they were sent around the world. Sometimes I would be traveling and peek in a theater and realize that the print was not bad but that the projector was running with half the light so it looked murky and mushy, so it's always been very hard.
Right now the biggest issue for me is TVs. I think we're living in the worst moment for TV. The manufacturers have gone insane and make all these TVs with brightness tuned for sports and reality TV. That’s the biggest fight that most filmmakers have now. How can we create a standard for the TV industry so that you can press a button and watch a movie in a movie standard? I'm watching The Godfather, which is the greatest film ever shot, and it looks like a soap opera because it has all this banding and clipping and artifacts, and it looks awful. That should be the biggest conversation - the Academy and everyone should get together with the makers of TVs and there should be a standard. But I also think that TVs should be 12 or 10-bit [not 8-bit].
DPR: Where do you see technologies like high dynamic range TV coming into the picture?
EL: It's fantastic. I've been working in high dynamic range and it's incredible! It's much better than anything we've seen. To start, there's black. It's very hard to do photography if you don't have black and you don't have white. TVs with OLED panels, or the new HDR monitors, are fantastic because the blacks look black. Then you have all these different tones, like dark grays, that you didn't have before. And the higher bit-depth is almost perfect in the sense that there's no banding.
DPR: With the increases in color gamut, and HDR brightness and contrast, it seems like there’s really a need for some standards.
EL: This is a case where everybody needs to get together and they have to create a standard. I was lucky on The Revenant because I got to make a ‘normal’ TV pass, an HDR TV pass, an IMAX pass, a Dolby pass, a cinema pass... so I had all these different color timing grades for the movie. But not all filmmakers have that. Often, filmmakers only have one pass and the rest is done by numbers or not even done.
My feeling is that the TVs and monitors are the biggest issues. It would be great to have 16-bit or 12-bit monitors with a little chip that tells you "This is the standard the filmmaker picked, and this is the LUT the filmmaker wants to use to show their work."
DPR: The grading/timing you do should take into account the monitor you're using, then translate the grade automatically for any viewing device, using a profile for that viewing device, to match the artist's intent?
EL: Exactly. Right now I have to do passes for everything. When I was doing what we call the 'normal' TV pass I had a plasma TV in front of me, as well as all sorts of other TVs. It looked like a Best Buy! The movie looked completely different on each one. It was heartbreaking.
DPR: Thanks so much for joining us today. Best of luck at the Oscars next week!
EL: Thank you as well!
* Although light field, as popularized by Lytro, is often associated with focus, perspective, and depth-based editing in post-processing, light field displays have the potential of glasses-free 3D and automatic perspective shifts based on your viewing perspective and location.