Brown University professor Chad Jenkins and his team have developed a robot capable of holding a conversation, gesturing, and following a human's movement without using remote control devices. "We need robots that can adapt over time, that can respond to human commands and interact with humans," says Jenkins, the director of Brown's Robotics, Learning, and Autonomy (RLAB) group.
One of RLAB's projects uses robotic soccer and a Nintendo Wii remote to enable users to control robots in the game from the robot's perspective. "The player sees what the robot sees, and decides what it should do in a given situation," Jenkins says. "The person knows what he wants the robot to do, yet the robot's control policy — the entity that makes decisions for it — may not be capable of reflecting that." The input from the human players is used to refine the robot control policy, helping the robot to improve its locomotion and manipulation skills.
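This kind of policy refinement is a form of learning from demonstration. A minimal sketch of the idea is below, assuming a simple state/action logging setup; the class, the nearest-neighbor policy, and the soccer actions are illustrative assumptions, not RLAB's actual method.

```python
# Sketch of learning from demonstration: the robot logs what a human
# teleoperator (e.g., via a Wii remote) did in each state, then imitates
# the human's choice in the most similar state it has seen.
import numpy as np

class DemonstrationPolicy:
    """Maps robot states to actions by imitating logged human commands."""

    def __init__(self):
        self.states = []   # observed robot states (feature vectors)
        self.actions = []  # the action the human chose in each state

    def record(self, state, human_action):
        """Log one (state, action) pair from human teleoperation."""
        self.states.append(np.asarray(state, dtype=float))
        self.actions.append(human_action)

    def act(self, state):
        """Pick the action the human took in the nearest logged state."""
        if not self.states:
            raise RuntimeError("no demonstrations recorded yet")
        dists = [np.linalg.norm(np.asarray(state, dtype=float) - s)
                 for s in self.states]
        return self.actions[int(np.argmin(dists))]

# Usage: human input refines the policy; the robot then acts on its own.
policy = DemonstrationPolicy()
policy.record([0.0, 1.0], "kick")   # human saw the ball ahead and kicked
policy.record([1.0, 0.0], "turn")   # human saw the ball aside and turned
print(policy.act([0.1, 0.9]))       # -> "kick"
```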
In another RLAB project, the objective is to make robots more closely reflect the will and behavior of humans. Working with a NASA humanoid upper-body robot, the researchers use motion-capture systems to record human movement in three dimensions and translate that movement into digital models that can be used to build a more effective robot control policy. The new policy has enabled the robot to replicate basic human motion and manipulate objects. Jenkins is also developing interfaces that could be used with a neural cursor control system developed by Brown neuroscience professor John Donoghue.
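One step in turning captured motion into robot commands is recovering joint angles from marker positions. The sketch below shows that step under stated assumptions; the marker layout and joint names are hypothetical, for illustration only.

```python
# Recover a joint angle from three motion-capture markers so a humanoid
# can replay the motion, e.g., elbow flexion from shoulder/elbow/wrist.
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` formed by the two adjoining segments."""
    u = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against floating-point drift outside arccos's domain.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# One captured frame (3-D marker positions in meters):
shoulder, elbow, wrist = [0.0, 1.4, 0.0], [0.3, 1.1, 0.0], [0.5, 1.3, 0.0]
print(joint_angle(shoulder, elbow, wrist))  # elbow angle to command the robot
```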
From Futurity.org