
Communications of the ACM

ACM TechNews

New Interactive System Detects Touch and Gestures on Any Surface



This composite image shows how fingers and hands are computed in a new touch-activated system that projects onto walls and other surfaces and allows people to interact with their environment and each other.

Credit: Purdue University

Purdue University researchers have developed an extended multitouch system that enables more than one person to simultaneously use a computing surface.

The system projects computer images onto walls and other surfaces, allowing users to interact with their environment and with each other. It can identify which of a user's fingers are touching any plain surface, as well as hand positions and gestures, and the researchers found it determines hand posture with 98 percent accuracy.

"You could use it for living environments, to turn appliances on, in a design studio to work on a concept, or in a laboratory, where a student and instructor interact," says Purdue professor Karthik Ramani.

The system relies on the Microsoft Kinect camera to sense three-dimensional space. "The camera sees where your hands are, which fingers you are pressing on the surface, tracks hand gestures, and recognizes whether there is more than one person working at the same time," Ramani says. The camera and hand model enable the system to locate the center of each hand, which is required for determining gestures and distinguishing between left and right hands.
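The article does not detail the Purdue algorithm, but the core idea behind depth-camera touch sensing on a plain surface can be illustrated with a short sketch: calibrate a per-pixel depth map of the empty surface, then flag pixels that sit just above it as contact. The Python below is an illustrative assumption only; the function names, thresholds, and synthetic data are not from the Purdue system.

```python
# Minimal sketch (not the Purdue implementation): flag "touch" pixels in a
# depth frame by comparing each pixel's depth against a precalibrated flat
# surface. All names, thresholds, and test data are illustrative assumptions.

import numpy as np


def calibrate_surface(empty_frames):
    """Average several depth frames of the empty surface to get a per-pixel
    baseline depth map (in millimeters, as a Kinect-style sensor reports)."""
    return np.mean(np.stack(empty_frames), axis=0)


def detect_touches(depth_frame, baseline, near_mm=5.0, far_mm=20.0):
    """Return a boolean mask of pixels sitting just above the surface:
    close enough to count as contact, but far enough to exclude sensor
    noise on the surface itself, and not so far that a hovering hand
    is mistaken for a touch."""
    height_above = baseline - depth_frame          # positive = closer to camera
    return (height_above > near_mm) & (height_above < far_mm)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake a 480x640 surface about 1 m from the camera, plus sensor noise.
    baseline = calibrate_surface(
        [1000.0 + rng.normal(0, 1.5, (480, 640)) for _ in range(10)]
    )
    # Fake a live frame with a "fingertip" pressed near the surface.
    frame = 1000.0 + rng.normal(0, 1.5, (480, 640))
    frame[200:210, 300:310] -= 10.0                # finger ~10 mm above surface
    touches = detect_touches(frame, baseline)
    print("touch pixels:", int(touches.sum()))
```

The two thresholds are the key design choice in such a scheme: the near bound rejects surface noise, while the far bound separates a pressed finger from a hand merely hovering above the surface.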

From Purdue University News 

Abstracts Copyright © 2012 Information Inc., Bethesda, Maryland, USA 


 
