
Communications of the ACM

ACM News

AI Overcomes Stumbling Block on Brain-Inspired Hardware



The BrainScaleS-2 neuromorphic chip, developed by neuromorphic engineers at Heidelberg University, uses tiny circuits that mimic the analog computing of the actual neurons in our brains.

Credit: Heidelberg University

Today's most successful artificial intelligence algorithms, artificial neural networks, are loosely based on the intricate webs of real neural networks in our brains. But unlike our highly efficient brains, running these algorithms on computers guzzles shocking amounts of energy: the biggest models consume nearly as much as five cars do over their lifetimes.

Enter neuromorphic computing, a closer match to the design principles and physics of our brains that could become the energy-saving future of AI. Instead of shuttling data over long distances between a central processing unit and memory chips, neuromorphic designs imitate the architecture of the jelly-like mass in our heads, with computing units (neurons) placed next to memory (stored in the synapses that connect neurons). To make them even more brain-like, researchers combine neuromorphic chips with analog computing, which can process continuous signals, just like real neurons. The resulting chips differ vastly in architecture and mode of computation from today's digital-only computers, which rely on binary processing of 0s and 1s.
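To make the contrast between continuous analog dynamics and binary digital processing concrete, here is a minimal Python sketch of a leaky integrate-and-fire neuron, a simplified cousin of the neuron models that analog neuromorphic circuits like BrainScaleS-2 emulate directly in silicon rather than computing step by step in binary. The time step, time constant, threshold, and input values below are illustrative assumptions, not parameters of the actual chip.

    import numpy as np

    def simulate_lif(input_current, dt=1e-4, tau=20e-3,
                     v_rest=0.0, v_thresh=1.0, v_reset=0.0):
        # Integrate dV/dt = (-(V - v_rest) + I) / tau and emit a spike
        # whenever the membrane voltage crosses the threshold.
        v = v_rest
        voltages, spikes = [], []
        for i_in in input_current:
            v += dt * (-(v - v_rest) + i_in) / tau  # leaky integration step
            if v >= v_thresh:                        # threshold crossing
                spikes.append(1)
                v = v_reset                          # reset after spiking
            else:
                spikes.append(0)
            voltages.append(v)
        return np.array(voltages), np.array(spikes)

    current = np.full(2000, 1.5)            # constant drive above threshold
    v, s = simulate_lif(current)
    print("spikes emitted:", int(s.sum()))

On a conventional processor this differential equation must be stepped through in discrete binary arithmetic, as above; an analog neuromorphic circuit instead lets the physics of its components (capacitors leaking charge, transistors thresholding) carry out the same dynamics continuously.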

 

From Quanta Magazine
View Full Article

 


 
