"Alexa, go to the kitchen and fetch me a snack"
Wouldn't we all appreciate a little help around the house, especially if that help came in the form of a smart, adaptable, uncomplaining robot? Sure, there are the one-trick Roombas of the appliance world. But MIT engineers are envisioning robots more like home helpers, able to follow high-level, Alexa-type commands, such as "Go to the kitchen and fetch me a coffee cup."
To carry out such high-level tasks, researchers believe robots will have to be able to perceive their physical environment as humans do.
"In order to make any decision in the world, you need to have a mental model of the environment around you," says Luca Carlone, assistant professor of aeronautics and astronautics at MIT. "This is something so effortless for humans. But for robots it's a painfully hard problem, where it's about transforming pixel values that they see through a camera, into an understanding of the world."
Now Carlone and his students have developed a representation of spatial perception for robots that is modeled after the way humans perceive and navigate the world.
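The article doesn't spell out the representation itself, but one common way to organize such a human-like spatial model is a hierarchical scene graph, in which raw geometry is abstracted into objects, rooms, and buildings. The sketch below is a minimal, hypothetical illustration of that idea in Python; all names and structure here are assumptions for illustration, not the MIT team's actual design:

```python
# Illustrative sketch only: a toy hierarchical "scene graph" for spatial
# perception, layering raw geometry up to objects, rooms, and buildings.
# Structure and names are hypothetical, not the MIT team's representation.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    layer: str                        # e.g. "object", "room", "building"
    attributes: dict = field(default_factory=dict)

@dataclass
class SceneGraph:
    nodes: dict = field(default_factory=dict)   # node_id -> Node
    parents: dict = field(default_factory=dict) # node_id -> set of parent ids

    def add(self, node: Node, parent_id: str | None = None) -> None:
        self.nodes[node.node_id] = node
        if parent_id is not None:
            self.parents.setdefault(node.node_id, set()).add(parent_id)

    def contained_in(self, node_id: str, ancestor_id: str) -> bool:
        """Walk parent links to answer 'is this object in that room?'"""
        frontier = set(self.parents.get(node_id, set()))
        while frontier:
            if ancestor_id in frontier:
                return True
            frontier = {p for n in frontier
                        for p in self.parents.get(n, set())}
        return False

# A command like "fetch me a coffee cup from the kitchen" then reduces
# to a graph query: find the cup node, confirm it sits under the kitchen.
g = SceneGraph()
g.add(Node("house", "building"))
g.add(Node("kitchen", "room"), parent_id="house")
g.add(Node("cup_1", "object", {"class": "coffee cup"}), parent_id="kitchen")
print(g.contained_in("cup_1", "kitchen"))  # True
```

The point of the hierarchy is that a high-level command can be resolved symbolically (which room, which object) while the lower layers retain the geometric detail needed to actually navigate and grasp.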
From SciTechDaily