Engineers around the world are exploring a wide range of wearable robot interfaces, from brain implants to touch-based controls. Professor Jacob Rosen and colleagues at the University of California, Santa Cruz's Bionics Lab have devised a robotic arm guided by the electrical signals the brain sends through the nerves to contract the muscles. These signals can be read by electrodes affixed to the skin at key locations over the muscles.
Rosen says controlling a robotic device via electromyography (EMG) signals is advantageous for several reasons, including the fact that the method is less invasive and less costly than reading signals from sources closer to the brain. EMG is also a better alternative to simple touch interfaces because it can yield new insights into muscle physiology and improve the ability to simulate and anticipate specific movements. "We are trying to allow a loose-leash situation by developing software that employs algorithms that emulate the muscle physiology, also known as a myoprocessor, to predict what a muscle is going to do before it has begun to do it," Rosen says. He is modeling the muscles to make his wearable robots more responsive to human intentions, and he is also using the robots to study how human motion works, how it goes wrong, and how best to fix such problems.
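The article does not include the lab's software, but the myoprocessor idea can be illustrated with a minimal sketch: rectify and low-pass filter the raw EMG to get an activation envelope, then map that activation to an estimated joint torque before the limb has visibly moved. The function names, the linear activation-to-torque mapping (a stand-in for a full Hill-type muscle model), and the 40 N·m scale below are illustrative assumptions, not the Bionics Lab's implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(emg, fs, cutoff_hz=6.0):
    """Rectify raw EMG and low-pass filter it to obtain an activation envelope."""
    rectified = np.abs(emg - np.mean(emg))            # remove DC offset, full-wave rectify
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

def predicted_torque(envelope, max_torque_nm=40.0):
    """Map normalized activation to an estimated joint torque.

    A linear mapping is used here as a placeholder for a physiological
    (Hill-type) muscle model; the 40 N·m peak is an assumed scale.
    """
    activation = envelope / (np.max(envelope) + 1e-9)  # normalize to [0, 1]
    return activation * max_torque_nm

if __name__ == "__main__":
    fs = 1000  # Hz sampling rate
    t = np.arange(0, 2.0, 1 / fs)
    # Synthetic EMG: band-limited noise modulated by an intended contraction
    rng = np.random.default_rng(0)
    intent = np.clip(np.sin(2 * np.pi * 0.5 * t), 0, None)
    emg = intent * rng.normal(0, 1, t.size)
    torque = predicted_torque(emg_envelope(emg, fs))
    print(f"peak predicted elbow torque: {torque.max():.1f} N·m")
```

Because the EMG burst precedes measurable joint motion by tens of milliseconds, an estimate of this kind gives the exoskeleton controller a head start on the wearer's intended movement.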
Rosen says medical applications, specifically rehabilitation, are his primary area of concentration. His EXO-UL7 robot arm can substitute for a physical therapist: by compensating for gravity, it allows whatever muscle control remains in an impaired arm to move the whole limb, plus a load.
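Gravity compensation here means the exoskeleton supplies the static torque needed to hold the limb's own weight, so residual muscle effort only has to produce the motion. The single-joint sketch below shows the idea; the limb mass, center-of-mass distance, and payload values are hypothetical examples, not EXO-UL7 parameters.

```python
import math

def gravity_torque(mass_kg, com_m, angle_rad, payload_kg=0.0, payload_m=0.3, g=9.81):
    """Static elbow torque needed to hold the forearm (and any payload) against
    gravity; the exoskeleton can supply this so the patient's muscles need not."""
    limb = mass_kg * g * com_m * math.cos(angle_rad)
    load = payload_kg * g * payload_m * math.cos(angle_rad)
    return limb + load

# Example: a 1.5 kg forearm with its center of mass 0.15 m from the elbow,
# held horizontal, with a 1 kg object in the hand 0.3 m from the elbow.
print(f"{gravity_torque(1.5, 0.15, 0.0, payload_kg=1.0):.1f} N·m")
```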
From CITRIS Newsletter
Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA