
Communications of the ACM

ACM TechNews

Brain-Inspired Algorithm Helps AI Systems Multitask and Remember



Researchers at the University of Chicago have found that adapting a well-known brain mechanism can dramatically improve the ability of artificial neural networks to learn multiple tasks, while avoiding "catastrophic forgetting," a persistent challenge in artificial intelligence (AI) research.

The project serves as an example of how neuroscience research can inform new computer science strategies, and how AI technology can help scientists better understand the human brain.

The new algorithm, called "context-dependent gating," enables a single artificial neural network to learn and perform hundreds of tasks with only minimal loss of accuracy.

Said the University of Chicago's Nicolas Masse, "With this method, a fairly medium-sized network can be carved up a whole bunch of ways to be able to learn many different tasks if done properly."
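The intuition behind that "carving up" is that each task is routed through only a small, fixed subset of the network's hidden units, so training on one task leaves most of the units used by other tasks untouched. The sketch below illustrates this gating idea in plain NumPy; the layer sizes, number of tasks, gating fraction, and function names are illustrative assumptions, not details taken from the study.

# A minimal sketch of per-task gating, assuming a simple NumPy
# feed-forward network; the sizes and the gating fraction below are
# illustrative choices, not values reported in the article.
import numpy as np

rng = np.random.default_rng(0)

n_input, n_hidden, n_output = 64, 256, 10
n_tasks = 100          # hypothetical number of tasks
keep_fraction = 0.2    # fraction of hidden units left active per task

# Shared weights, trained across all tasks.
W1 = rng.normal(scale=0.1, size=(n_input, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_output))

# One fixed, randomly chosen binary mask per task ("context").
# Each task uses its own sparse subnetwork of hidden units, so updates
# made while learning one task disturb only a small slice of the network.
masks = (rng.random((n_tasks, n_hidden)) < keep_fraction).astype(float)

def forward(x, task_id):
    # Run the shared network, gating hidden activity by the task's mask.
    h = np.maximum(0.0, x @ W1)      # ReLU hidden layer
    h = h * masks[task_id]           # context-dependent gating
    return h @ W2

# Example: the same input yields task-specific hidden activity patterns.
x = rng.normal(size=n_input)
out_task_3 = forward(x, task_id=3)
out_task_42 = forward(x, task_id=42)

Because each task's mask is fixed and largely non-overlapping with the others, learning a new task overwrites little of what earlier tasks rely on, which is what allows many tasks to share a single, moderately sized network.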

From University of Chicago
View Full Article

 

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA


 
