A study by scientists at Germany’s Heidelberg University and the Max Planck Institute for Dynamics and Self-Organization reveals how "critical states" can be used to optimize artificial neural networks running on brain-inspired neuromorphic hardware.
Critical states are the points at which systems can quickly and fundamentally change their overall characteristics.
Although they are widely assumed to be optimal for computation in recurrent neural networks, the researchers found that criticality is not beneficial for every task.
In experiments on a prototype of the analog neuromorphic BrainScaleS-2 chip, the researchers found that varying the input strength allows the network's distance to criticality to be tuned easily.
They also showed a clear relationship between criticality and task performance, finding that only complex, memory-intensive tasks benefited from criticality.
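As a rough illustration of the underlying idea (a simplified sketch, not the study's analog-hardware setup), a linear rate model can show how the distance to criticality, here set via the spectral radius of a recurrent weight matrix, controls how long a network retains a transient input. All function names and parameters below are illustrative choices, not from the paper:

```python
import numpy as np

def make_recurrent(n, spectral_radius, seed=0):
    """Random recurrent weight matrix rescaled to a target spectral radius.

    spectral_radius < 1: subcritical (activity decays);
    spectral_radius close to 1: near-critical (activity persists longer).
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
    rho = max(abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho)

def memory_lifetime(W, steps=500, tol=1e-3, seed=1):
    """Inject a unit-norm pulse at t=0, iterate x <- W x, and count
    how many steps the activity norm stays above tol."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=W.shape[0])
    x /= np.linalg.norm(x)
    for t in range(steps):
        x = W @ x
        if np.linalg.norm(x) < tol:
            return t
    return steps

# A network far from criticality forgets its input much faster
# than one tuned close to the critical point.
subcritical = memory_lifetime(make_recurrent(100, 0.6))
near_critical = memory_lifetime(make_recurrent(100, 0.95))
print(subcritical, near_critical)
```

This is consistent with the finding that only memory-intensive tasks benefit from criticality: the near-critical network holds information about past inputs for longer, which helps tasks that need that memory but offers no advantage for tasks that do not.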
From HPCwire
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA