University of Michigan (UM) researchers have developed a model that makes it easier for robots to follow commands by enabling them to understand natural human language.
Many robots rely on simultaneous localization and mapping (SLAM) to determine their whereabouts, concurrently tracking their position on a map and updating their knowledge of the environment.
UM's Jason Corso and colleagues remote-controlled a robot around a tabletop maze arranged in 116 configurations without being able to see the labyrinth, while a natural language processing model matched the navigator's commands with the driver's actions.
Once the language dataset was compiled, the models that parse such commands were trained in simulation and learned to follow plain-text instructions.
Corso said, "The challenge for humans to interact with SLAM-based machines is we need to think on their terms. It's really rigid and we have to adapt to the robots. The goal is to flip that and have the robot adapt to the human language."
From New Scientist
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA