
Communications of the ACM

ACM TechNews

The Journey from Jar Jar to Sign Language: Motion Capture Opens the Door to a New Way to Communicate



Dalhousie University professor Aaron Newman, the Canada Research Chair in Cognitive Neuroscience, is using motion capture technology to develop a more thorough understanding of sign language and other forms of gesture-based communication. A major challenge in studying the effects of sign language on the brain is that human gestures are frequently aided or affected by other cues such as facial expressions. To eliminate these confounding cues, Newman prepared short videos that strip away everything but the most basic movements. The videos were shown to study participants connected to an electroencephalography (EEG) system that monitors brain activity.

"We can see instantly how people react, within milliseconds," he says. "We want to know where and when symbolic communication crosses the threshold into full-blown language in the brain."

To achieve a high level of clarity in the videos, Newman fitted student volunteers with dozens of fiber-optic sensors and recorded their movements. In all, 20 students helped produce more than 80 short video clips from the motion capture data. Over the next few months, Newman will test the videos on both sign language users and people who do not know sign language, and he is already collecting a second set of motion capture data that he plans to turn into another set of animations.
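As a rough illustration of how motion capture data can be turned into such stripped-down animations, the following sketch renders recorded marker trajectories as nothing but moving dots, preserving movement while discarding faces, clothing, and other cues. The file name, the (frames x markers x 2) data layout, and the 60 fps frame rate are assumptions for the example, not details of the study (Python/Matplotlib; saving the clip requires ffmpeg):

# Illustrative sketch: play back motion capture marker positions as a
# dots-only animation. The input file and its layout are hypothetical.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

markers = np.load("signer_markers.npy")      # hypothetical: (frames, markers, 2)

fig, ax = plt.subplots(figsize=(4, 6))
ax.set_xlim(markers[..., 0].min(), markers[..., 0].max())
ax.set_ylim(markers[..., 1].min(), markers[..., 1].max())
ax.axis("off")                                # nothing on screen but the dots
dots, = ax.plot([], [], "o", color="black", markersize=4)

def update(frame):
    # Show only the marker positions for this frame; no body, no face.
    dots.set_data(markers[frame, :, 0], markers[frame, :, 1])
    return (dots,)

anim = FuncAnimation(fig, update, frames=len(markers), interval=1000 / 60)
anim.save("sign_clip.mp4", fps=60)            # one short clip per recorded sign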

From Newswise
