
Communications of the ACM

ACM TechNews

The Human Touch, in Robots


Haizhou Li with OLIVIA robot

Haizhou Li of A*STAR Research is upgrading OLIVIA to learn from speakers and to understand naturally spoken queries.

Credit: A*STAR

Researchers at the A*STAR Institute of High Performance Computing are developing robots that can respond directly to human speech. "When robots come to live in a human space, we need to take care of many more things than for manufacturing robots installed on the factory floor," says A*STAR researcher Haizhou Li.

Li leads the ASORO program, which has developed seven robots since 2008, including a robot butler, a robot home assistant, and OLIVIA, a robot receptionist that also serves as a research platform for studying social robotics technologies. OLIVIA can track human faces, eyes, and lip movements, and can locate the source of human speech using head-mounted microphones. Li wants to upgrade OLIVIA so it can learn from speakers, make taxi reservations, and shake hands. "What we are investigating now is how we can take context into account, and how behaviors can develop over time," says A*STAR's Martin Saerbeck, who is developing a robotic tutor focused on teaching children vocabulary.
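The article does not describe how OLIVIA's microphones pinpoint a speaker, but a common technique for this kind of sound-source localization is time-difference-of-arrival (TDOA) estimation: cross-correlate the signals from two spaced microphones, read off the lag at the correlation peak, and convert it into a bearing. The sketch below is a minimal illustration of that general idea using simulated signals; it is an assumption for exposition, not the ASORO implementation.

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival (seconds) between two
    microphone signals by locating the cross-correlation peak.
    Positive result means sig_b arrived later than sig_a."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # lag in samples
    return -lag / fs

def bearing_from_tdoa(tdoa, mic_spacing, speed_of_sound=343.0):
    """Convert a TDOA into a bearing angle (radians) for a two-microphone
    array, assuming a far-field (plane-wave) source."""
    ratio = np.clip(tdoa * speed_of_sound / mic_spacing, -1.0, 1.0)
    return np.arcsin(ratio)

# Simulated example: a windowed 800 Hz tone reaches microphone B
# five samples after microphone A.
fs = 16000
t = np.arange(1024) / fs
tone = np.sin(2 * np.pi * 800 * t) * np.hanning(len(t))
delay = 5
mic_a = np.concatenate([tone, np.zeros(delay)])
mic_b = np.concatenate([np.zeros(delay), tone])

tdoa = estimate_tdoa(mic_a, mic_b, fs)          # 5 / 16000 s
angle = bearing_from_tdoa(tdoa, mic_spacing=0.2)  # bearing in radians
```

Real systems typically refine this with generalized cross-correlation (e.g., GCC-PHAT) and more than two microphones, but the lag-to-angle geometry is the same.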

In a recent study, two groups of children studied with two different versions of the iCAT robot, one more socially responsive than the other. The children working with the more social iCAT scored significantly higher and showed greater intrinsic and task motivation than those working with the standard teaching-style iCAT.

From A*STAR Research

 

Abstracts Copyright © 2011 Information Inc., Bethesda, Maryland, USA

