Drone lighting could be coming soon to your studio
A flying flash rig that tracks the positions of both photographer and subject to maintain consistent lighting angles has been developed by researchers from the Massachusetts Institute of Technology and Cornell University. In a project designed to test coordination between aerial robots and ground-based targets, the researchers programmed a flash-carrying drone to light people in the studio as the subjects and the photographer changed positions.
|In the researchers' experiments, the robot helicopter was equipped with a continuous-light source, a photographic flash, and a laser rangefinder. Picture courtesy of the researchers.|
The team presented a prototype robot at the International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging. The aim of the experiment, according to the researchers, is to provide photographers 'with squadrons of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.'
For the rim-lighting exercise the drone was programmed to position itself so that its light would create a specified thickness of highlight on the edges of the subject. Monitoring the position and stance of the subject via images streamed from the photographer's camera at a rate of 20 frames per second, the drone was able not only to follow the subject but also to detect the angle at which the subject was standing relative to the camera.
Researcher Manohar Srikanth explains: 'If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light. So in order to compensate for the change in the body, the light has to change its position quite dramatically.'
Rim lighting was chosen to test the system because it requires a greater degree of accuracy than some other forms of lighting, but the system will be able to illuminate a subject from any angle the photographer specifies, for a range of lighting effects. Although some way off going into production, drone-mounted lighting could spell the end of the traditional studio stand, though perhaps a silent drone will need to be developed first, unless the kit includes headache tablets for the model.
The idea may not be as revolutionary as these researchers think - photographers have been using voice-activated lighting supports in the studio and on location for many years. They're known as assistants.
Autonomous vehicles could automatically assume the right positions for photographic lighting
Larry Hardesty | MIT News Office
July 11, 2014 - Lighting is crucial to the art of photography. But lights are cumbersome and time-consuming to set up, and outside the studio, it can be prohibitively difficult to position them where, ideally, they ought to go.
Researchers at MIT and Cornell University hope to change that by providing photographers with squadrons of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.
At the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging in August, they take the first step toward realizing this vision, presenting a prototype system that uses an autonomous helicopter to produce a difficult effect called "rim lighting," in which only the edge of the photographer's subject is strongly lit.
According to Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and is now a senior researcher at Nokia, he and his coauthors - MIT professor of computer science and engineering Frédo Durand and Cornell's Kavita Bala, who also did her PhD at MIT - chose rim lighting for their initial experiments precisely because it's a difficult effect.
"It's very sensitive to the position of the light," Srikanth says. "If you move the light, say, by a foot, your appearance changes dramatically."
With the new system, the photographer indicates the direction from which the rim light should come, and the miniature helicopter flies to that side of the subject. The photographer then specifies the width of the rim as a percentage of its initial value, repeating that process until the desired effect is achieved.
Thereafter, the robot automatically maintains the specified rim width. "If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light," Srikanth says. "So in order to compensate for the change in the body, the light has to change its position quite dramatically."
In the same way, Srikanth says, the system can compensate for the photographer's movements. In both cases, the camera itself supplies the control signal. Roughly 20 times a second, the camera produces an image that is not stored on its own memory card but transmitted to a computer running the researchers' control algorithm. The algorithm evaluates the rim width and adjusts the robot's position accordingly.
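The closed loop described here, with the camera's rim-width measurement feeding back into the drone's position, can be illustrated with a minimal proportional-control sketch. The function name, gain value, and the use of a single light-angle variable are all hypothetical simplifications for illustration; the actual controller must also handle the UAV's flight dynamics.

```python
def control_step(target_width, measured_width, light_angle, gain=0.05):
    """One iteration of a hypothetical rim-lighting feedback loop.

    If the measured rim is thinner than the target, the error is
    positive and the light is nudged further around the subject;
    if the rim is too thick, the light moves back. The gain is an
    illustrative value, not one from the paper.
    """
    error = target_width - measured_width
    return light_angle + gain * error
```

Run roughly 20 times a second, a loop like this continually re-aims the light as the subject and photographer move.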
"The challenge was the manipulation of the very difficult dynamics of the UAV [unmanned aerial vehicle] and the feedback from the lighting estimation," Durand says. "That's where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that's needed just to keep the thing flying and deal with the information from the lidar [the UAV's laser rangefinder] and the rim-lighting estimation."
As Srikanth explains, that required some algorithmic streamlining. "When we first started looking at it, we thought we'd come up with a very fancy algorithm that looks at the whole silhouette of the subject and tries to figure out the morphological properties, the curve of the edge, and so on and so forth, but it turns out that those calculations are really time-consuming," Srikanth says.
Instead, the algorithm simply looks for the most dramatic gradations in light intensity across the whole image and measures their width. With a rim-lit subject, most of those measurements will congregate around the same value, which the algorithm takes to be the width of the rim.
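A simplified version of that idea can be sketched in a few lines: scan the image for bright bands, collect their widths, and take the most common value as the rim width. This is a loose illustration of the approach described above, not the researchers' actual algorithm; the thresholding scheme and function name are assumptions.

```python
import numpy as np

def estimate_rim_width(image, bright_thresh=0.5):
    """Estimate rim width in pixels from a grayscale image (values 0-1).

    For each row, measure the lengths of contiguous runs of brightly
    lit pixels. With a rim-lit subject most runs cluster around one
    value, so the mode of the run lengths is taken as the rim width.
    """
    widths = []
    for row in image:
        run = 0
        for px in (row > bright_thresh):
            if px:
                run += 1
            elif run:
                widths.append(run)
                run = 0
        if run:  # bright run reaching the row's edge
            widths.append(run)
    if not widths:
        return 0.0
    values, counts = np.unique(widths, return_counts=True)
    return float(values[np.argmax(counts)])
```

Because it avoids any analysis of the subject's silhouette or edge curvature, an estimator of this kind is cheap enough to run inside a 20Hz control loop.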
In experiments, this quick approximation was able to keep up with the motions of both the subject and the photographer while maintaining a consistent rim width.
The researchers tested their prototype in a motion-capture studio, which uses a bank of high-speed cameras to measure the position of specially designed light-reflecting tags with millimeter accuracy; several such tags were affixed to the helicopter.
But, Srikanth explains, the purpose of the tests was to evaluate the control algorithm, which performed well. Algorithms that gauge robots' location based only on measurements from onboard sensors are a major area of research in robotics, and the new system could work with any of them. Even rim lighting, Srikanth says, doesn't require the millimeter accuracy of the motion-capture studio. "We only need a resolution of 2 or 3 centimeters," he says.
"Rim lighting is a particularly interesting effect, because you want to precisely position the lighting to bring out silhouettes," says Ravi Ramamoorthi, a professor of computer science and engineering at the the University of California, San Diego. "Other effects are in some sense easier - one doesn't need as precise positioning for frontal lighting. So the technique would probably generalize to other light effects. But at the same time, as-precise control and manipulation may not be needed. Manual static positioning might be adequate."
"Clearly, taking the UAV system out of the lab and into the real world, and making it robust enough to be practical is a challenge," Ramamoorthi adds, "but also something that should be doable given the rapid advancement of all of these technologies."