ACM News

Giving Robots Human-Like Perception of Their Physical Environments


A multi-frame action sequence of a human in motion.

Massachusetts Institute of Technology engineers have developed a representation of spatial perception for robots that is modeled after the way humans perceive and navigate the world.

Credit: Luca Carlone et al.

"Alexa, go to the kitchen and fetch me a snack"

Wouldn't we all appreciate a little help around the house, especially if that help came in the form of a smart, adaptable, uncomplaining robot? Sure, there are the one-trick Roombas of the appliance world. But MIT engineers are envisioning robots more like home helpers, able to follow high-level, Alexa-type commands, such as "Go to the kitchen and fetch me a coffee cup."

To carry out such high-level tasks, researchers believe robots will have to be able to perceive their physical environment as humans do.

"In order to make any decision in the world, you need to have a mental model of the environment around you," says Luca Carlone, assistant professor of aeronautics and astronautics at MIT. "This is something so effortless for humans. But for robots it's a painfully hard problem, where it's about transforming pixel values that they see through a camera, into an understanding of the world."

Now Carlone and his students have developed a representation of spatial perception for robots that is modeled after the way humans perceive and navigate the world.
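The article does not spell out the details of that representation, but the idea of a layered, human-like model of space, with raw geometry at the bottom and higher-level concepts such as objects and rooms on top, can be illustrated with a small sketch. The Python below is a hypothetical toy example; the class names, layers, and query method are assumptions made for illustration, not the MIT group's actual code or API.

```python
# Toy sketch of a layered spatial model: rooms contain places, places contain
# objects. All names and structure here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    layer: str                       # e.g. "room", "place", "object"
    position: tuple                  # rough 3D location in the map frame
    label: str = ""                  # semantic label, e.g. "coffee cup"
    children: list = field(default_factory=list)

class SpatialModel:
    """A minimal hierarchy a robot could query with high-level commands."""
    def __init__(self):
        self.nodes = {}

    def add(self, node, parent_id=None):
        self.nodes[node.node_id] = node
        if parent_id is not None:
            self.nodes[parent_id].children.append(node.node_id)

    def find(self, label):
        """Resolve a query such as 'where is the coffee cup?'."""
        return [n for n in self.nodes.values() if n.label == label]

# Build a tiny map and resolve the command from the article.
model = SpatialModel()
model.add(Node("kitchen", "room", (4.0, 1.0, 0.0), "kitchen"))
model.add(Node("counter", "place", (4.5, 1.2, 0.9), "counter"), parent_id="kitchen")
model.add(Node("cup_1", "object", (4.6, 1.3, 1.0), "coffee cup"), parent_id="counter")
print(model.find("coffee cup"))      # the robot now knows where to go
```

In a real system, a model like this would have to be built and updated continuously from camera images rather than hand-coded, which is the hard perception problem Carlone describes.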

 

From SciTechDaily
View Full Article

 


 
