
Communications of the ACM

ACM TechNews

Drone Lighting: Autonomous Vehicles Could Automatically Assume the Right Positions for Photographic Lighting



In the researchers' experiments, the robot helicopter was equipped with a continuous-light source, a photographic flash, and a laser rangefinder.

Credit: Manohar Srikanth/Frédo Durand/Kavita Bala

A team of researchers at the Massachusetts Institute of Technology (MIT) and Cornell University is developing algorithms that let photographers use camera-mounted controls to guide drone-mounted lights into just the right position for a shot.

The team started by tackling a major challenge: developing algorithms that enable the airborne lighting robots to produce a lighting effect known as "rim lighting," in which only the edge of the subject is strongly lit. The photographer first positions the drone until the desired effect is achieved. The algorithms then take over, measuring the light falling on the subject using images transmitted from the photographer's camera together with data from the drone's ranging systems, and adjusting the drone's position to maintain the lighting effect as the subject, the photographer, or both move around.
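The maintain-the-effect step can be sketched as a simple control loop: once the photographer has fixed the light's geometry relative to the subject, the drone continually recomputes where that geometry places it as the subject moves, and flies smoothly toward that point. This is an illustrative approximation in flat 2-D, not the researchers' actual algorithm; the function names and the fixed-offset geometry are assumptions.

```python
import math

def drone_position(subject_xy, subject_heading, offset_angle, distance):
    """Return the (x, y) position the lighting drone should occupy to keep
    the geometry, relative to the subject, that the photographer chose when
    first positioning the light.

    offset_angle is measured from the subject's heading; distance is the
    light-to-subject range (e.g., as read by a laser rangefinder).
    """
    angle = subject_heading + offset_angle
    return (subject_xy[0] + distance * math.cos(angle),
            subject_xy[1] + distance * math.sin(angle))

def step_toward(current_xy, target_xy, max_step=0.25):
    """Advance the drone at most max_step toward the target each control
    tick, so the light repositions smoothly as the subject moves."""
    dx = target_xy[0] - current_xy[0]
    dy = target_xy[1] - current_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return target_xy
    scale = max_step / dist
    return (current_xy[0] + dx * scale, current_xy[1] + dy * scale)

# Subject at the origin facing +x; the light was placed directly behind it
# at 2 m. If the subject later moves and turns, the same call yields the
# new drone target that preserves the rim-lighting geometry.
target = drone_position((0.0, 0.0), 0.0, math.pi, 2.0)
```

In the real system the offset would be refined in closed loop from the measured rim-lit width in the camera images, rather than held purely from geometry.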

The researchers tested the technique in a motion-capture lab and found it worked well. Manohar Srikanth, who worked on the project as a graduate student and postdoc at MIT, says that because rim lighting demands such precise positioning, the technique should generalize readily to less demanding lighting effects.

The team will present its current prototype at the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging in August.

From MIT News

 

Abstracts Copyright © 2014 Information Inc., Bethesda, Maryland, USA
