
Communications of the ACM

ACM TechNews

Using Ears, Not Just Eyes, Improves Robot Perception


A square tray attached to the arm of a Sawyer robot. Researchers built a dataset by recording audio and video of 60 common objects as they slid and rolled around the tray.

Credit: Carnegie Mellon University

Carnegie Mellon University researchers conducted a large-scale study of interactions between sound and robotic action and determined that sounds can help robots distinguish between objects and identify specific sound-causing actions.

The team compiled a dataset from simultaneous video and audio recordings of 60 common objects as they slid or rolled around a tray attached to a robot arm and crashed into its sides, cataloging 15,000 interactions in all. The researchers also collected data by having the robot arm push objects along a surface. They learned, for example, that a robot could use knowledge gleaned from the sound of one set of objects to predict the physical properties of previously unseen objects.

Robots that used sound successfully classified objects 76% of the time.
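The article does not describe the researchers' model, but the general idea of classifying objects from impact sounds can be sketched with simple spectral features and a nearest-centroid classifier. Everything below is illustrative: the synthetic "impact sounds," the two object labels, and the feature choice are assumptions, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz (assumed for this toy example)

def synth_impact(freq, n=2048):
    """Toy stand-in for an impact sound: a decaying sinusoid plus noise."""
    t = np.arange(n) / SR
    return np.sin(2 * np.pi * freq * t) * np.exp(-8 * t) + 0.05 * rng.standard_normal(n)

def features(x, bands=16):
    """Log energy in equal-width frequency bands of the magnitude spectrum."""
    mag = np.abs(np.fft.rfft(x))
    return np.log1p(np.array([np.sum(b**2) for b in np.array_split(mag, bands)]))

# Two hypothetical objects, distinguished by different resonant frequencies.
train = {"mug": [features(synth_impact(440)) for _ in range(20)],
         "block": [features(synth_impact(900)) for _ in range(20)]}
centroids = {label: np.mean(v, axis=0) for label, v in train.items()}

def classify(x):
    """Assign the label whose feature centroid is nearest."""
    f = features(x)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

correct = sum(classify(synth_impact(440)) == "mug" for _ in range(25))
correct += sum(classify(synth_impact(900)) == "block" for _ in range(25))
accuracy = correct / 50
```

On these cleanly separated synthetic sounds the toy classifier is near perfect; the study's 76% figure reflects the much harder problem of real recordings of 60 object types.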

From Carnegie Mellon University
View Full Article


Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA



