
Communications of the ACM

ACM TechNews

Robot Learns to Smile and Frown



The Einstein robot head at UC San Diego performs asymmetric random facial movements as a part of the expression learning process.

Credit: Calit2

Researchers at the University of California, San Diego (UCSD) have developed an Einstein robot that learns to smile and frown realistically through machine learning. The robot's head features 30 facial muscles, each driven by a tiny servo motor.

Developmental psychologists theorize that babies learn to control their bodies through systematic exploratory movements that at first appear random. "We applied this same idea to the problem of a robot learning to make realistic facial expressions," says Javier Movellan with UCSD's Machine Perception Laboratory.

The UCSD researchers began the learning process by directing the Einstein robot to twist and turn its face in all directions while it watched its reflection in a mirror and analyzed its own expressions with facial expression detection software. This generated the data the machine learning algorithms needed to map facial expressions to the movements of the muscle motors. Once the robot had learned the relationship between expressions and the corresponding motor movements, it could produce expressions it had never encountered.
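As a rough illustration of this "body babbling" approach, the sketch below simulates random motor commands, records the expression features a detector would report, and fits an inverse model that maps a desired expression back to servo commands. The function names, feature dimensions, and the choice of ridge regression are all hypothetical assumptions for the example; none of them come from the UCSD system.

```python
import numpy as np
from sklearn.linear_model import Ridge

N_MOTORS = 30      # the Einstein head has 30 servo-driven facial "muscles"
N_FEATURES = 12    # assumed dimensionality of the expression detector's output

rng = np.random.default_rng(0)
# Unknown face mechanics, simulated here as a fixed linear map.
_W_true = rng.standard_normal((N_MOTORS, N_FEATURES))

def detect_expression(motor_commands):
    """Stand-in for the expression-detection software watching the mirror."""
    return motor_commands @ _W_true + 0.01 * rng.standard_normal(N_FEATURES)

# 1. "Body babbling": issue random motor commands and record the expressions they produce.
babble_motors = rng.uniform(-1.0, 1.0, size=(2000, N_MOTORS))
babble_faces = np.array([detect_expression(m) for m in babble_motors])

# 2. Learn the inverse model: which motor commands yield a desired expression?
inverse_model = Ridge(alpha=1e-3)
inverse_model.fit(babble_faces, babble_motors)

# 3. Ask for a novel target expression the robot never made while babbling,
#    e.g. a feature vector the detector would assign to a smile.
target_smile = rng.standard_normal(N_FEATURES)
motor_commands = inverse_model.predict(target_smile.reshape(1, -1))[0]
print("servo commands for target expression:", np.round(motor_commands, 2))
```

In this toy version the inverse mapping is linear, so a single regression suffices; the real robot faces nonlinear skin and motor dynamics, which is why exploratory data collection and learned models are needed at all.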

Machine Perception Laboratory researchers are studying the Einstein robot's face and head to find ways to automate the process of teaching robots to make realistic facial expressions.

From UCSD News

 

Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA


 
