
Communications of the ACM

ACM TechNews

AI Trained on Baby's Experiences Yields Clues to How We Learn Language


This 18-month-old baby is one of the children given a head-mounted camera to record their view of the world.

The research shows AI can pick up some basic elements of language from the sensory input of a single child’s experience, even without preexisting knowledge of grammar or other social abilities.

Credit: Wai Keen Vong

Researchers at New York University found that a simple AI program could learn basic elements of language from the sensory input of a child's experience.

The researchers used recordings from the SAYCam database of an Australian baby known only as Sam, who is now 11 years old.

Trained on just 61 hours of footage of Sam, comprising 600,000 video frames paired with 37,500 transcribed words, the AI was able to match basic nouns to images on par with an AI trained on 400 million captioned images.
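Pairing video frames with co-occurring words in this way is characteristic of contrastive image-text learning. The following Python sketch illustrates that general idea under assumed, simplified conditions; the encoders, dimensions, and temperature are placeholders, not the researchers' actual model.

import torch
import torch.nn.functional as F

# Minimal contrastive image-text sketch (hypothetical, not the NYU model).
batch, img_dim, txt_dim, emb_dim = 8, 512, 300, 128

image_proj = torch.nn.Linear(img_dim, emb_dim)   # stands in for a vision encoder
text_proj = torch.nn.Linear(txt_dim, emb_dim)    # stands in for a word/utterance encoder

image_feats = torch.randn(batch, img_dim)        # e.g. features of video frames
text_feats = torch.randn(batch, txt_dim)         # e.g. features of co-occurring words

# Project both modalities into a shared embedding space and normalize.
img_emb = F.normalize(image_proj(image_feats), dim=-1)
txt_emb = F.normalize(text_proj(text_feats), dim=-1)

# Similarity matrix: each frame should best match the word heard alongside it.
logits = img_emb @ txt_emb.T / 0.07              # 0.07 is a commonly used temperature
targets = torch.arange(batch)

# Symmetric contrastive loss pulls matching frame-word pairs together
# and pushes mismatched pairs apart.
loss = (F.cross_entropy(logits, targets) + F.cross_entropy(logits.T, targets)) / 2
loss.backward()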

From The Washington Post
View Full Article - May Require Paid Subscription

 

Abstracts Copyright © 2024 SmithBucklin, Washington, D.C., USA

