Computers can read a person's body language to tell whether they are bored or interested in what they see on a screen, according to a new study led by body-language expert Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School.
The research shows that by measuring a person's movements as they use a computer, it is possible to judge their level of interest by monitoring the tiny, near-constant movements that people ordinarily exhibit, known as non-instrumental movements.
If someone is absorbed in what they are watching or doing — what Witchel calls "rapt engagement" — there is a decrease in these involuntary movements.
"Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements," Witchel says. "It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle."
The discovery could have a significant impact on the development of artificial intelligence. Future applications could include online tutoring programs that adapt to a person's level of interest, re-engaging learners who show signs of boredom. It could even help in the development of companion robots, which would be better able to estimate a person's state of mind.
For experienced designers such as movie directors or game makers, this technology could also provide a complementary moment-by-moment reading of whether the events on the screen are interesting. While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.
"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Witchel says. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci-fi' but are becoming a realistic possibility within our lifetimes."
In the study, 27 participants faced a range of three-minute stimuli on a computer, from fascinating games to tedious readings from EU banking regulation. They used a handheld trackball rather than a mouse, to minimize instrumental movements, and their movements over the three minutes were quantified with video motion tracking. In two comparable reading tasks, the more engaging reading produced a significant 42% reduction in non-instrumental movement.
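For illustration, movement in a setup like this can be quantified from video by summing frame-to-frame pixel differences. The sketch below is a minimal, hypothetical example of that kind of motion quantification, not the study's actual pipeline; the file names, and the use of simple OpenCV frame differencing in place of the study's tracking method, are assumptions.

    import cv2
    import numpy as np

    def movement_scores(video_path):
        """Mean absolute pixel change between consecutive frames: a crude
        per-frame measure of how much the person in view is moving."""
        cap = cv2.VideoCapture(video_path)
        scores = []
        ok, prev = cap.read()
        if ok:
            prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                scores.append(float(np.mean(cv2.absdiff(gray, prev))))
                prev = gray
        cap.release()
        return scores

    # Hypothetical comparison of recordings from an engaging and a boring task:
    engaged = np.mean(movement_scores("engaging_task.mp4"))
    bored = np.mean(movement_scores("boring_task.mp4"))
    print(f"movement reduction during engagement: {1 - engaged / bored:.0%}")

A lower average score during the engaging task would correspond to the suppression of non-instrumental movement that the study reports.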
The study, "Non-Instrumental Movement Inhibition (NIMI) Differentially Suppresses Head and Thigh Movements during Screenic Engagement: Dependence on Interaction," is published in the journal Frontiers in Psychology.
The research team also included Carlos Santos and James Ackah from Witchel's group, media expert Carina Westling from the University of Sussex, and the clinical biomechanics group at Staffordshire University, led by Professor Nachiappan Chockalingam.