Researchers at the U.K.'s University of Bradford say they have come up with a way to tell whether your smile is real or just a ruse. As it turns out, what your eyes are doing while you smile betrays whether you really mean it.
"We use two main sets of muscles when we smile: the zygomaticus major, which is responsible for the curling upwards of the mouth, and the orbicularis oculi, which causes crinkling around our eyes," explained Hassan Ugail, a professor of visual computing at the University of Bradford.
"In fake smiles, it is often only the mouth muscles which move," Ugail said. "But, as humans, we often don't spot the lack of movement around the eyes. The computer software can spot this much more reliably."
The researchers said they decided to study how a smile distributes itself across a face over time after their review of research literature revealed no other experiments with that focus.
In a sense, the researchers say their analysis of smiling is similar to the type of perception possessed by expert poker players, who are able to perceive tiny facial tics or 'microexpressions' that betray a person's true emotional state.
"Microexpressions are more spontaneous and subtle facial movements that happen involuntarily, thus revealing one's genuine, underlying emotion," explains Ron Kimmel, a computer science professor at Technion-Israel Institute of Technology who specializes in visual computing.
"The ability to automatically recognize facial expressions and infer the emotional state has a wide range of applications," says Kimmel. "These include emotionally and socipally aware systems, improved gaming experience, driver drowsiness detection," and detecting pain and distress.
The researchers say they began analyzing how a smile is expressed over time by taking a video of a smiling face, then identifying key landmarks on the face (around the eyes, cheeks, and mouth) to study and compare. Those landmarks are identified using CHEHRA, software that automatically detects and tracks facial landmarks, including those around the eyes, in real time.
Once those landmarks have been identified, the researchers use an algorithm they developed to automatically measure dynamic changes around the eyes, cheeks, and mouth as a smile unfolds. The algorithm's output lets them quantify precisely how each of those regions changes over time, and compare how the changes differ between an authentic smile and a forced one.
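The paper's exact measurements are not reproduced in this article, but the general idea, tracking how much each facial region moves as a smile unfolds, can be sketched briefly in Python. In this hypothetical snippet, the landmark index groups (EYE_IDX, CHEEK_IDX, MOUTH_IDX) and the (frames, points, 2) array layout are illustrative assumptions, not the study's actual CHEHRA output:

```python
import numpy as np

# Hypothetical landmark index groups; real index layouts depend on the
# tracker used (the study's CHEHRA indices are not reproduced here).
EYE_IDX   = range(0, 12)    # landmarks ringing both eyes
CHEEK_IDX = range(12, 20)   # landmarks on the cheeks
MOUTH_IDX = range(20, 40)   # landmarks around the mouth

def region_movement(landmarks, idx):
    """Total frame-to-frame displacement of one landmark group.

    landmarks: array of shape (frames, points, 2) holding (x, y)
    coordinates tracked across a smile video.
    """
    region = landmarks[:, list(idx), :]   # (frames, k, 2)
    step = np.diff(region, axis=0)        # per-frame coordinate deltas
    dist = np.linalg.norm(step, axis=2)   # Euclidean distance per point
    return dist.sum()                     # total movement over the clip

def smile_signature(landmarks):
    """Share of total movement in each region as the smile unfolds."""
    raw = np.array([region_movement(landmarks, g)
                    for g in (EYE_IDX, CHEEK_IDX, MOUTH_IDX)])
    return raw / raw.sum()  # fractions for eyes, cheeks, mouth
```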
For baseline smile data, the researchers used two control datasets of smiles already known to be either posed or authentic, against which they compared the smiles captured in their research. For posed smiles, they used the publicly available Extended Cohn-Kanade Dataset (CK+), which features 82 subjects with posed smiles; metadata attached to the images identifies which of the six basic emotions each image shows: happiness, surprise, anger, fear, disgust, or sadness. For authentic smiles, they used the publicly available Multimedia Understanding Group (MUG) Facial Expression Dataset, which features 52 subjects of Caucasian origin.
Posed smiles were easy enough to coax from study subjects, while authentic smiles were elicited as they watched a video selected to trigger spontaneous smiling, according to Ugail.
What the researchers found was that real smiles generally involve 10% more muscle movement around the eyes than posed smiles.
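In terms of the sketch above, that comparison amounts to averaging the eye region's share of movement over clips labeled genuine and clips labeled posed, then taking the relative gap. The helper below is again a hypothetical illustration (genuine_clips and posed_clips are assumed lists of landmark arrays), not the study's actual statistical analysis:

```python
import numpy as np
# Reuses smile_signature() from the sketch above.

def eye_movement_gap(genuine_clips, posed_clips):
    """Relative difference in the eye region's movement share between
    genuine and posed smile clips (each a (frames, points, 2) array)."""
    genuine = np.mean([smile_signature(lm)[0] for lm in genuine_clips])
    posed = np.mean([smile_signature(lm)[0] for lm in posed_clips])
    return (genuine - posed) / posed  # ~0.10 would mirror the reported 10%
```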
"Hence, to answer the question of which part of the facial feature contains the most information with regard to a genuine smile, our studies conclude it is indeed the eyes," says Ugail. "Our results not only confirm what already exists in the literature; i.e., that the spontaneous genuine smile is truly in the eyes, but it also gives further insight into the exact distribution of the smile across the face."
While the researchers stress that more study is needed before their results can be commercialized, they say their initial findings are promising. With more research, for example, a scientific tool could be developed for social scientists and clinicians to study the emotional states of individuals and groups.
They also see potential for their research to be incorporated into biometric identity software, particularly software that identifies a specific person by comparing a video of that person against a database of many videos of the same person displaying both posed and authentic smiles.
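How such a matching step might look is easy to sketch, though the following is purely hypothetical and not part of the Bradford work: it assumes each person is enrolled with several smile signatures (computed as in the earlier sketch), and accepts a probe video whose signature falls within an arbitrary distance threshold of one of them.

```python
import numpy as np
# Reuses smile_signature() from the earlier sketch.

def verify_identity(probe_landmarks, enrolled_signatures, threshold=0.05):
    """Hypothetical check: do the probe video's smile dynamics match
    any of a person's enrolled signatures (posed and authentic alike)?

    enrolled_signatures: list of 3-vectors (eyes, cheeks, mouth shares)
    threshold: maximum Euclidean distance to accept, chosen arbitrarily
    """
    probe = smile_signature(probe_landmarks)
    dists = [np.linalg.norm(probe - np.asarray(s))
             for s in enrolled_signatures]
    return min(dists) <= threshold
```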
The researchers caution that such applications must wait on further work: experiments using larger datasets of smiles, both posed and authentic, need to be run through CHEHRA, and future studies need to include smiles from people of more varied ethnic backgrounds, according to Ugail.
Terrance E. Boult, director of the Vision and Security Technology Lab at the University of Colorado, Colorado Springs, takes issue with the research, which he said "shows the experimental design has some fundamental problems, in that the 'true smiles' are from one dataset while the fake are from another, which means the differences could be, and in my view are more likely, caused by unintended differences in the datasets."
Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY, USA.