Digital supercomputing can be expensive and energy-hungry, yet it still struggles with problems the human brain tackles easily, such as understanding speech or recognizing what a photograph shows. Even though artificial neural networks that apply deep learning have made much headway over the last few years, some computer scientists think they can do better with systems that even more closely resemble a living brain. Such neuromorphic computing, as this brain emulation is known, might not only accomplish tasks that current computers cannot, but also lead to a clearer understanding of how human memory and cognition work. Moreover, if researchers can figure out how to build the machines out of analog circuits, they could run them on a fraction of the energy needed by modern computers.
"The real driver for neuromorphic computing is energy efficiency, and the current design space on CMOS isn't particularly energy efficient," says Mark Stiles, a physicist who is a project leader in the Center for Nanoscale Science and Technology at the U.S. National Institutes for Standards and Technology (NIST) in Gaithersburg, MD. Analog circuits consume less power per operation than existing complementary metal oxide semiconductor (CMOS) technologies, and so should prove more efficient. On the other hand, analog circuits are vulnerable to noise, and the technologies for building them are not as advanced as those for CMOS chips.