Carnegie Mellon University Ph.D. student Chris Harrison and Microsoft researchers Desney Tan and Dan Morris have developed Skinput, technology that combines bio-acoustic sensors and machine-learning programs to enable people to use parts of the body as touchpads to control mobile devices.
The researchers say Skinput could help people take advantage of the computing power now packed into compact devices that can be worn or carried. "With Skinput, we can use our own skin—the body's largest organ—as an input device," Harrison says.
In the prototype, acoustic sensors attached to the upper arm capture the sound generated by actions such as flicking or tapping the fingers together. The researchers found that a tap from each fingertip produces a unique acoustic signature that machine-learning programs can learn to identify. In a test with 20 subjects, the system identified inputs with 88 percent accuracy. The prototype armband pairs the sensor array with a small projector that can superimpose buttons onto the user's forearm or project a keypad onto the palm of the hand.
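The article does not detail Skinput's signal-processing pipeline, but the general idea it describes — short acoustic windows reduced to features and fed to a trained classifier — can be sketched in a few lines. The sketch below is illustrative only: the sampling rate, FFT-band features, scikit-learn SVC classifier, and synthetic tap generator are all assumptions for demonstration, not the researchers' actual method or data.

```python
# Illustrative sketch of a tap-classification pipeline of the kind the
# article describes: acoustic window -> feature vector -> supervised
# classifier. All parameters here are assumptions; synthetic signals
# stand in for real sensor recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

RATE = 5500    # assumed sensor sampling rate (Hz)
WINDOW = 512   # assumed samples per tap window

def features(window: np.ndarray) -> np.ndarray:
    """Reduce one tap window to a small feature vector:
    mean amplitude plus eight coarse FFT band energies."""
    spectrum = np.abs(np.fft.rfft(window))[1:]      # drop DC -> 256 bins
    bands = spectrum.reshape(8, -1).mean(axis=1)    # 8 frequency bands
    return np.concatenate(([np.abs(window).mean()], bands))

def synthetic_tap(finger: int, rng: np.random.Generator) -> np.ndarray:
    """Fake a tap whose dominant frequency depends on which finger
    tapped, mimicking the per-finger signatures the article mentions."""
    t = np.arange(WINDOW) / RATE
    tone = np.sin(2 * np.pi * (120 + 60 * finger) * t)
    decay = np.exp(-t * 40)                         # taps ring down fast
    return tone * decay + 0.05 * rng.standard_normal(WINDOW)

rng = np.random.default_rng(0)
X, y = [], []
for finger in range(5):                             # five fingertip classes
    for _ in range(200):
        X.append(features(synthetic_tap(finger, rng)))
        y.append(finger)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.0%}")
```

In a real system the classifier would of course be trained on windows captured from the armband's sensors rather than synthetic tones, with one class per on-body input location.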
View a video of Chris Harrison describing and demonstrating Skinput.
From Carnegie Mellon News
View Full Article
Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA