
Communications of the ACM

ACM TechNews

AIs Are Starting to Learn Like Human Babies by Grasping and Poking Objects


A robot touching and pushing objects to learn about them.

A project at Carnegie Mellon University is striving to teach artificial intelligences the same way human children learn.

Credit: Jason Lee/Reuters

A project at Carnegie Mellon University (CMU) could enable artificial intelligences (AIs) to learn in a more human way.

The CMU researchers note that babies learn by poking and pushing, and say their goal is to use physical robotic interactions to teach an AI to recognize objects. The team programmed a robotic arm to grasp, push, poke, and perceive an object from multiple angles, letting it interact with 100 objects and collect 130,000 data points.
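To make the scale of that collection concrete, the sketch below simulates logging interaction records for 100 objects at an average of 1,300 interactions each, yielding the 130,000 data points the article cites. It is purely illustrative and not the CMU team's code; the record fields, action names, and per-object counts are assumptions.

```python
# Illustrative stand-in for a robot's interaction-logging loop (not the CMU code).
from dataclasses import dataclass, field
import random

@dataclass
class InteractionRecord:
    object_id: int
    action: str                                  # "grasp", "push", or "poke" (assumed labels)
    viewpoint: int                                # index of the camera angle
    image: list = field(default_factory=list)    # placeholder for captured pixel data
    outcome: float = 0.0                         # e.g. grasp success or measured displacement

def collect_dataset(num_objects=100, points_per_object=1300):
    """Simulate collecting 100 objects x 1,300 interactions = 130,000 data points."""
    actions = ["grasp", "push", "poke"]
    dataset = []
    for obj in range(num_objects):
        for _ in range(points_per_object):
            dataset.append(InteractionRecord(
                object_id=obj,
                action=random.choice(actions),
                viewpoint=random.randrange(8),
                outcome=random.random(),
            ))
    return dataset

if __name__ == "__main__":
    data = collect_dataset()
    print(len(data))  # 130000
```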

The researchers fed the data into a convolutional neural network, training it to learn a visual representation of each of the 100 objects. With the touch data, the network classified images of the objects in the ImageNet research database more accurately than it did without it.
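One way to picture this setup is a shared convolutional trunk supervised by the robot's interaction signals, whose learned features are then reused for image classification. The PyTorch sketch below is a minimal illustration under that assumption, not the authors' implementation; the network sizes, head definitions, and loss terms are invented for clarity.

```python
# Minimal sketch: a shared CNN trunk with heads supervised by interaction data
# (grasp, push, poke) plus object identity. Assumed architecture, not the paper's.
import torch
import torch.nn as nn

class InteractionCNN(nn.Module):
    def __init__(self, num_objects=100):
        super().__init__()
        # Shared visual trunk whose features are later reused for classification.
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Task-specific heads driven by the robot's physical interactions.
        self.grasp_head = nn.Linear(64, 1)               # did the grasp succeed?
        self.push_head = nn.Linear(64, 2)                # how far did the object move?
        self.poke_head = nn.Linear(64, 1)                # response when poked
        self.identity_head = nn.Linear(64, num_objects)  # which of the 100 objects?

    def forward(self, images):
        feats = self.trunk(images)
        return {
            "grasp": self.grasp_head(feats),
            "push": self.push_head(feats),
            "poke": self.poke_head(feats),
            "identity": self.identity_head(feats),
        }

# Toy forward/backward pass on random tensors, just to show the wiring.
model = InteractionCNN()
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 100, (8,))
out = model(images)
loss = nn.functional.cross_entropy(out["identity"], labels) \
     + nn.functional.mse_loss(out["grasp"], torch.rand(8, 1))
loss.backward()
print({k: v.shape for k, v in out.items()})
```

The intuition is that the interaction heads force the shared trunk to encode physically meaningful properties of each object, which is why the touch-supervised features transfer to purely visual classification.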

"The overall idea of robots learning from continuous interaction is a very powerful one," says University of Washington professor Sergey Levine. "Robots collect their own data, and in principle they can collect as much of it as they need, so learning from continuous active interaction has the potential to enable tremendous progress in machine perception, autonomous decision-making, and robotics."

From Quartz

 

Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA


 

