Drone lighting could be coming soon to your studio
A flying flash rig that tracks the position of both photographer and subject to maintain consistent lighting angles has been developed by researchers from the Massachusetts Institute of Technology (MIT) and Cornell University. In a project designed to test coordination between aerial robots and ground-based targets, the researchers programmed a flash-carrying drone to light people in the studio as the subjects and the photographer changed positions.
Image caption: In the researchers' experiments, the robot helicopter was equipped with a continuous-light source, a photographic flash, and a laser rangefinder. Picture courtesy of the researchers.
The team presented a prototype robot at the International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging. The aim of the experiment, according to the researchers, is to provide photographers 'with squadrons of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.'
For the rim-lighting exercise, the drone was programmed to position itself so that its light would create a certain thickness of highlight on the edges of the subject. Monitoring the position and stance of the subject via images streamed from the photographer's camera at a rate of roughly 20 frames per second, the drone was able not only to follow the subject but also to detect the angle at which the subject was standing relative to the camera.
Researcher Manohar Srikanth explains: 'If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light. So in order to compensate for the change in the body, the light has to change its position quite dramatically.'
Rim lighting was chosen to test the system because it requires a greater degree of accuracy than some other forms of lighting, but the system will be able to illuminate a subject from any angle the photographer specifies, for a range of lighting effects. Although some way off going into production, drone-mounted lighting could spell the end of the traditional studio stand, though perhaps the development of a silent drone will be needed, unless the kit includes headache tablets for the model.
The idea may not be as revolutionary as these researchers think - photographers have been using voice-activated lighting supports in the studio and on location for many years. They're known as assistants.
Autonomous vehicles could automatically assume the right positions for photographic lighting
Larry Hardesty | MIT News Office
July 11, 2014 - Lighting is crucial to the art of photography. But lights are cumbersome and time-consuming to set up, and outside the studio, it can be prohibitively difficult to position them where, ideally, they ought to go.
Researchers at MIT and Cornell University hope to change that by providing photographers with squadrons of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.
At the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging in August, they take the first step toward realizing this vision, presenting a prototype system that uses an autonomous helicopter to produce a difficult effect called "rim lighting," in which only the edge of the photographer's subject is strongly lit.
According to Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and is now a senior researcher at Nokia, he and his coauthors - MIT professor of computer science and engineering Frédo Durand and Cornell's Kavita Bala, who also did her PhD at MIT - chose rim lighting for their initial experiments precisely because it's a difficult effect.
"It's very sensitive to the position of the light," Srikanth says. "If you move the light, say, by a foot, your appearance changes dramatically."
With the new system, the photographer indicates the direction from which the rim light should come, and the miniature helicopter flies to that side of the subject. The photographer then specifies the width of the rim as a percentage of its initial value, repeating that process until the desired effect is achieved.
Thereafter, the robot automatically maintains the specified rim width. "If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light," Srikanth says. "So in order to compensate for the change in the body, the light has to change its position quite dramatically."
In the same way, Srikanth says, the system can compensate for the photographer's movements. In both cases, the camera itself supplies the control signal. Roughly 20 times a second, the camera produces an image that is not stored on its own memory card but transmitted to a computer running the researchers' control algorithm. The algorithm evaluates the rim width and adjusts the robot's position accordingly.
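The camera-in-the-loop behavior described above can be sketched as a simple proportional controller on rim width. This is a hypothetical simplification for illustration only: the function name, sign convention, and `gain` parameter are assumptions, and the paper's actual controller must also handle the UAV's flight dynamics, which are omitted here.

```python
def rim_control_step(target_width, measured_width, light_angle, gain=0.02):
    """One iteration of a (hypothetical) proportional control step.

    The real system closes the loop roughly 20 times a second: each
    camera frame is analyzed for rim width, and the drone's angular
    position around the subject is nudged to reduce the error. Only
    the proportional correction is shown; flight dynamics are omitted.
    """
    error = measured_width - target_width
    # A wider-than-desired rim means the light is exposing too much of
    # the subject's front, so move the light back toward the subject's
    # edge (sign convention is an assumption for this sketch).
    return light_angle - gain * error
```

Run at frame rate, repeated small corrections of this kind keep the rim width near the photographer's chosen value even as the subject turns.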
"The challenge was the manipulation of the very difficult dynamics of the UAV [unmanned aerial vehicle] and the feedback from the lighting estimation," Durand says. "That's where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that's needed just to keep the thing flying and deal with the information from the lidar [the UAV's laser rangefinder] and the rim-lighting estimation."
As Srikanth explains, that required some algorithmic streamlining. "When we first started looking at it, we thought we'd come up with a very fancy algorithm that looks at the whole silhouette of the subject and tries to figure out the morphological properties, the curve of the edge, and so on and so forth, but it turns out that those calculations are really time-consuming," Srikanth says.
Instead, the algorithm simply looks for the most dramatic gradations in light intensity across the whole image and measures their width. With a rim-lit subject, most of those measurements will congregate around the same value, which the algorithm takes to be the width of the rim.
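The streamlined estimator described above can be illustrated with a short sketch: find strong intensity gradients, measure the width of the bright run each one bounds, and take the most common width as the rim estimate. The function name, thresholds, and the restriction to horizontal gradients are assumptions made for this illustration, not details from the paper.

```python
import numpy as np

def estimate_rim_width(image, grad_thresh=0.5):
    """Estimate rim-light width (in pixels) from a grayscale image.

    Hypothetical simplification of the approach described above:
    strong horizontal gradients mark the border of the lit rim; the
    width of each bright run is measured, and the modal width is
    taken as the rim estimate.
    """
    grad = np.diff(image.astype(float), axis=1)
    peak = image.max()
    widths = []
    # Each strong positive gradient marks the start of a highlight run
    for row, col in zip(*np.where(grad > grad_thresh * peak)):
        w, c = 0, col + 1
        # Walk right while the pixel stays bright, counting run width
        while c < image.shape[1] and image[row, c] > 0.5 * peak:
            w += 1
            c += 1
        if w:
            widths.append(w)
    if not widths:
        return 0
    # Most measurements congregate around the true rim width: the mode
    vals, counts = np.unique(widths, return_counts=True)
    return int(vals[np.argmax(counts)])
```

Because it reduces to a gradient scan and a histogram mode, a computation of this shape is cheap enough to run on every frame of a 20 fps stream, which is the point of the streamlining Srikanth describes.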
In experiments, this quick approximation was able to keep up with the motions of both the subject and the photographer while maintaining a consistent rim width.
The researchers tested their prototype in a motion-capture studio, which uses a bank of high-speed cameras to measure the position of specially designed light-reflecting tags with millimeter accuracy; several such tags were affixed to the helicopter.
But, Srikanth explains, the purpose of the tests was to evaluate the control algorithm, which performed well. Algorithms that gauge robots' location based only on measurements from onboard sensors are a major area of research in robotics, and the new system could work with any of them. Even rim lighting, Srikanth says, doesn't require the millimeter accuracy of the motion-capture studio. "We only need a resolution of 2 or 3 centimeters," he says.
"Rim lighting is a particularly interesting effect, because you want to precisely position the lighting to bring out silhouettes," says Ravi Ramamoorthi, a professor of computer science and engineering at the the University of California, San Diego. "Other effects are in some sense easier - one doesn't need as precise positioning for frontal lighting. So the technique would probably generalize to other light effects. But at the same time, as-precise control and manipulation may not be needed. Manual static positioning might be adequate."
"Clearly, taking the UAV system out of the lab and into the real world, and making it robust enough to be practical is a challenge," Ramamoorthi adds, "but also something that should be doable given the rapid advancement of all of these technologies."