Advanced algorithms could help bring the speed and accuracy of clinically viable prosthetic devices closer to that of a healthy human arm. Engineers from Cambridge University have teamed up with neuroscientists at Stanford University to develop intelligent algorithms for decoding neural activity into physical commands.
Cambridge professor Zoubin Ghahramani describes neurons as noisy information channels, but notes that neural prosthetic designers still rely on fairly simple linear methods to decode their activity. "So you get activity from many, many neurons spiking and it is a challenge to infer the desired action and direction of movement," he says.
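The article does not spell out the decoders involved, but a minimal sketch of the kind of simple linear method Ghahramani refers to, assuming binned spike counts mapped to a two-dimensional movement command by least squares, might look like this (all data, electrode counts, and dimensions below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: binned spike counts from 96 electrodes over
# 5,000 time bins, paired with the hand velocity observed in each bin.
n_bins, n_neurons = 5000, 96
true_map = rng.normal(size=(n_neurons, 2))
spikes = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
velocity = spikes @ true_map + rng.normal(scale=2.0, size=(n_bins, 2))

# Fit the linear decoder by least squares: velocity ~ spikes @ weights.
weights, *_ = np.linalg.lstsq(spikes, velocity, rcond=None)

# Decode a new bin of spiking activity into a movement command.
new_bin = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
decoded_velocity = new_bin @ weights
print(decoded_velocity)  # the 2-D velocity that would drive the prosthetic
```

The noisiness Ghahramani highlights is exactly what such a fixed linear map struggles with: it averages over the noise at training time but cannot respond when the recorded signals change.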
The researchers will bring in more advanced machine-learning methods that can adapt to changing electrode recordings. They plan to test the algorithms in neural prosthetic devices implanted in primates before moving to human trials. "The field of neural implants is moving quite rapidly but the idea of having brain signals control previously paralyzed bodies will take a bit longer," Ghahramani says.
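The adaptability described here could take many forms; as one hedged illustration only (not the researchers' actual method), a decoder might re-estimate its weights online with recursive least squares so it tracks slow drift in the electrode recordings. The function and parameters below are hypothetical:

```python
import numpy as np

def rls_update(weights, P, x, y, forgetting=0.99):
    """One recursive-least-squares step for an adaptive linear decoder.

    weights : (n_neurons, 2) current decoding matrix
    P       : (n_neurons, n_neurons) inverse input-covariance estimate
    x       : (n_neurons,) spike counts for the latest time bin
    y       : (2,) movement command observed (or inferred) for that bin
    """
    x = x[:, None]                               # column vector
    gain = P @ x / (forgetting + x.T @ P @ x)    # update gain for this bin
    error = y - (x.T @ weights).ravel()          # decoder's prediction error
    weights = weights + gain @ error[None, :]    # nudge weights toward y
    P = (P - gain @ x.T @ P) / forgetting        # discount old covariance
    return weights, P

# Usage sketch: start from an identity P and zero weights, then fold in
# each new bin so the decoder follows drift in the recordings.
n_neurons = 96
weights = np.zeros((n_neurons, 2))
P = np.eye(n_neurons)
rng = np.random.default_rng(1)
x = rng.poisson(5.0, n_neurons).astype(float)
y = np.array([0.3, -0.1])
weights, P = rls_update(weights, P, x, y)
```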
From The Engineer (U.K.)