The University of Illinois' Naresh Shanbhag is pushing for a new computer architecture that blends computing and memory so devices can be smarter without consuming more energy.
One group pursuing this architecture, led by Stanford University's Subhasish Mitra, is layering carbon-nanotube integrated circuits atop resistive random-access memory (RRAM). The researchers produced a demonstration showing the system could efficiently classify the language of a sentence.
Meanwhile, Shanbhag's group and others are sticking with existing materials, employing analog control circuits that surround arrays of memory cells in new ways. Instead of sending data to the processor, they program these circuits to run simple artificial intelligence algorithms, an approach they call a "deep in-memory architecture."
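The article does not spell out how such an architecture computes, but the rough sketch below models the idea in plain Python: a classifier's dot product is accumulated at low precision at the edge of each memory subarray, so only a handful of partial sums, rather than every stored word, ever leave the array. The subarray size, quantization, and function names here are illustrative assumptions, not Shanbhag's published design.

```python
import numpy as np

# Hypothetical sketch: the subarray size, quantization, and function names
# below are illustrative assumptions, not the published deep in-memory design.

SUBARRAY_WORDS = 256  # assumed number of words read per subarray access


def quantize(x, bits=6, full_scale=32.0):
    """Mimic the limited precision of an analog accumulation."""
    levels = 2 ** bits
    x = float(np.clip(x, -full_scale, full_scale))
    step = full_scale / (levels / 2)
    return round(x / step) * step


def conventional_score(memory, weights):
    """Ship every stored word to the processor, then compute the dot product there."""
    data = np.array(memory, copy=True)  # models a full digital readout
    return float(np.dot(data, weights))


def deep_in_memory_score(memory, weights):
    """Accumulate a coarse partial dot product at the edge of each subarray;
    only the small per-subarray sums ever cross the memory interface."""
    partials = []
    for start in range(0, len(memory), SUBARRAY_WORDS):
        block = memory[start:start + SUBARRAY_WORDS]
        w = weights[start:start + SUBARRAY_WORDS]
        partials.append(quantize(np.dot(block, w)))  # low-precision, analog-like
    return sum(partials)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.standard_normal(4 * SUBARRAY_WORDS)    # data held in the array
    template = rng.standard_normal(4 * SUBARRAY_WORDS)  # classifier weights
    print("conventional  :", conventional_score(stored, template))
    print("deep in-memory:", deep_in_memory_score(stored, template))
```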
Shanbhag believes conducting processing at the edges of subarrays is sufficiently deep to boost energy efficiency and speed without sacrificing storage capacity. His group produced a 10-fold improvement in energy efficiency and a fivefold improvement in speed when using analog circuits to detect faces in images stored in static RAM (SRAM).
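The gain comes largely from data movement. The back-of-envelope sketch below uses assumed relative costs (sensing a bit in place is cheap, moving it off the array is expensive) to show why shipping only per-subarray scores, rather than whole images, cuts the total; the constants and the resulting ratio are illustrative, not the group's measurements.

```python
# Hypothetical accounting: the relative energy costs below are assumptions made
# for illustration; only the structure of the comparison (ship every bit out
# vs. ship only per-subarray results) follows the article.

IMAGE_BITS = 320 * 240 * 8   # assumed grayscale image held in SRAM
SUBARRAY_COLS = 256          # assumed bits sensed per subarray access
E_SENSE = 1.0                # relative energy to sense one bit on a bitline
E_MOVE = 10.0                # relative energy to move one bit off the array
SCORE_BITS = 16              # assumed width of one partial detection score


def conventional_energy():
    """Every stored bit is sensed and then shipped to the processor."""
    return IMAGE_BITS * (E_SENSE + E_MOVE)


def deep_in_memory_energy():
    """Every bit is still sensed, but only one small score per subarray access
    crosses the memory interface."""
    accesses = IMAGE_BITS // SUBARRAY_COLS
    return IMAGE_BITS * E_SENSE + accesses * SCORE_BITS * E_MOVE


if __name__ == "__main__":
    c, d = conventional_energy(), deep_in_memory_energy()
    print(f"conventional  : {c:,.0f} relative units")
    print(f"deep in-memory: {d:,.0f} relative units")
    print(f"illustrative saving: {c / d:.1f}x")
```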
From IEEE Spectrum
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA