Automatic systems that analyze gestures and facial expressions could make the challenging task of diagnosing depression easier.
One such system, SimSensei, is a digital avatar that interviews people to assess their state of mind, using facial-recognition software together with the depth-sensing camera of a Microsoft Kinect to capture and analyze body language. University of Southern California researchers identified characteristic movements that may signal depression by interviewing volunteers who were not depressed alongside people who had been diagnosed with depression or post-traumatic stress disorder. Tracking each subject's face with a high-definition webcam and body movements with the Kinect, the team found that depressed participants fidgeted more, lowered their gaze more often, and smiled less than the non-depressed group.
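The article does not describe SimSensei's internal pipeline; purely as an illustration, the sketch below shows how per-frame cues of the kind mentioned (gaze direction, smiling, body motion) might be aggregated into interview-level statistics. All class names, fields, and thresholds here are hypothetical assumptions, not details of the USC system.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Frame:
    """Hypothetical per-frame features from a face/body tracker."""
    gaze_pitch_deg: float   # negative values = looking downward
    is_smiling: bool        # smile detected in this frame
    body_motion: float      # magnitude of skeletal joint movement

def summarize_cues(frames, gaze_down_deg=-10.0, fidget_motion=0.5):
    """Aggregate per-frame cues into interview-level statistics.

    The thresholds are illustrative assumptions, not values from
    the SimSensei study.
    """
    n = len(frames)
    if n == 0:
        return {}
    return {
        "gaze_down_ratio": sum(f.gaze_pitch_deg < gaze_down_deg for f in frames) / n,
        "smile_ratio": sum(f.is_smiling for f in frames) / n,
        "fidget_ratio": sum(f.body_motion > fidget_motion for f in frames) / n,
        "mean_body_motion": mean(f.body_motion for f in frames),
    }

# Example: a short synthetic interview segment.
segment = [
    Frame(gaze_pitch_deg=-15.0, is_smiling=False, body_motion=0.7),
    Frame(gaze_pitch_deg=-2.0, is_smiling=True, body_motion=0.1),
    Frame(gaze_pitch_deg=-12.0, is_smiling=False, body_motion=0.6),
]
print(summarize_cues(segment))

In practice, a system of this kind would feed such summary statistics, along with many other audio and visual features, into a trained classifier rather than reading a diagnosis off the raw numbers.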
A similar effort is underway at the University of Canberra, which is working with the Black Dog Institute to develop a machine-vision system that looks for distinctive facial expressions, slower-than-usual blinking, and specific upper-body movements.
Meanwhile, University of Pittsburgh researchers are studying how facial expressions change as a person undergoes treatment for depression. These new systems will be put to the test in October, when researchers from around the world gather at the ACM Multimedia conference in Barcelona, Spain, for a contest to determine which system diagnoses depression most accurately.
From New Scientist
Abstracts Copyright © 2013 Information Inc., Bethesda, Maryland, USA