When it comes to reading emotions on people's faces, artificial intelligence still lags behind human observers, according to a new study involving UCL.

The difference was particularly pronounced when it came to spontaneous displays of emotion, according to the findings published in PLOS One.

The research team, led by Dublin City University, looked at eight "out of the box" automatic classifiers for facial affect recognition (software that identifies human emotions from faces) and compared their emotion recognition performance to that of human observers.

The researchers found that human observers recognized emotions with 72% accuracy, whereas the recognition accuracy of the artificial intelligence systems tested varied, ranging from 48% to 62%.

Lead author Dr. Damien Dupré (Dublin City University) said: "AI systems claiming to recognize humans' emotions from their faces are now very easy to develop. However, most of them are based on inconclusive scientific evidence that people are expressing emotions in the same way.

"For these systems, human emotions come down to only six basic emotions, but they do not cope well with blended emotions.

"Companies using such systems need to be aware that the results obtained are not a measure of the emotion felt, but merely a measure of how much one's face matches with a face supposed to correspond to one of these six emotions."

The study involved 937 videos sampled from two large databases that conveyed the six basic emotions (happiness, sadness, anger, fear, surprise, and disgust). Two well-known dynamic facial expression databases were chosen: BU-4DFE from Binghamton University in New York, and another from The University of Texas at Dallas. Both are annotated in terms of emotion categories and contain either posed or spontaneous facial expressions. All of the examined expressions were dynamic, to reflect the realistic nature of human facial behavior.

Classification accuracy for the AI systems was consistently lower for spontaneous affective behavior, but the gap narrowed for posed expressions: the two best-performing systems matched human accuracy in identifying posed expressions.

To evaluate the accuracy of emotion recognition, the study compared the performance of human judges with that of eight commercially available automatic classifiers.
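For readers curious how accuracy figures like those above are computed, the following minimal Python sketch scores predictions against a database's emotion annotations. It is purely illustrative: the labels and values are hypothetical, not taken from the study, and real benchmarks involve many more videos and per-emotion breakdowns.

```python
# Illustrative only: hypothetical labels, not data from the study.
# "Accuracy" here is simply the fraction of videos whose predicted
# emotion matches the database's annotated emotion category.

def accuracy(predicted, annotated):
    """Share of predictions that match the ground-truth annotations."""
    assert len(predicted) == len(annotated)
    hits = sum(p == a for p, a in zip(predicted, annotated))
    return hits / len(annotated)

# Hypothetical outputs for five videos (the study used 937).
ground_truth = ["happiness", "anger", "fear", "sadness", "disgust"]
human_votes  = ["happiness", "anger", "fear", "surprise", "disgust"]
ai_votes     = ["happiness", "anger", "surprise", "surprise", "disgust"]

print(f"human accuracy: {accuracy(human_votes, ground_truth):.0%}")  # 80%
print(f"AI accuracy:    {accuracy(ai_votes, ground_truth):.0%}")     # 60%
```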

Co-author Dr. Eva Krumhuber (UCL Psychology & Language Sciences) added: "AI has come a long way in identifying people's facial expressions, but our research suggests that there is still room for improvement in recognizing genuine human emotions."

The PLOS One study was conducted by researchers at Dublin City University, UCL, University of Bremen and Queen's University Belfast.

Dr. Krumhuber recently led a separate study, published in Emotion, comparing humans and machines in emotion recognition across fourteen different databases of dynamic facial expressions. The smaller study, which used a different method to analyze the machine data, found that AI was comparable to humans at recognizing emotions.

More information: Damien Dupré et al. A performance comparison of eight commercially available automatic classifiers for facial affect recognition, PLOS ONE (2020). DOI: 10.1371/journal.pone.0231968

Journal information: PLOS ONE, Emotion