The researchers mapped 49 landmarks on the face, using these to assess how the face changes as we smile. Credit: Hassan Ugail

The dynamics of how men and women smile differ measurably, according to new research, enabling artificial intelligence (AI) to automatically assign gender purely based on a smile.

Although automatic gender recognition is already available, existing methods use static images and compare fixed facial features. The new research, by the University of Bradford, is the first to use the dynamic movement of the smile to automatically distinguish between men and women.

Led by Professor Hassan Ugail, the team mapped 49 landmarks on the face, mainly around the eyes, mouth and down the nose. They used these to assess how the face changes as we smile, driven by the underlying muscle movements - capturing both changes in the distances between the different points and the 'flow' of the smile: how much, how far and how fast the different points on the face moved as the smile was formed.
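The sketch below illustrates the general kind of dynamic features the article describes - distances between landmarks plus how far and how fast the points move across video frames. The specific landmark indices, frame rate and feature definitions are assumptions for illustration only, not the study's actual feature set.

```python
# A minimal sketch of dynamic smile features, assuming 49 (x, y) landmarks
# tracked per video frame. Landmark indices and frame rate are hypothetical.
import numpy as np

def smile_dynamics(frames, fps=30.0):
    """frames: array of shape (T, 49, 2) - 49 landmark positions per frame."""
    frames = np.asarray(frames, dtype=float)

    # Static geometry per frame: pairwise distances between all landmarks.
    diffs = frames[:, :, None, :] - frames[:, None, :, :]   # (T, 49, 49, 2)
    dists = np.linalg.norm(diffs, axis=-1)                   # (T, 49, 49)

    # "How much / how far": total displacement of each landmark over the smile.
    step = np.linalg.norm(np.diff(frames, axis=0), axis=-1)  # (T-1, 49)
    path_length = step.sum(axis=0)                            # (49,)

    # "How fast": mean and peak speed of each landmark (pixels per second).
    speed = step * fps
    mean_speed = speed.mean(axis=0)
    peak_speed = speed.max(axis=0)

    # Change in mouth width as a simple expansiveness cue
    # (lip-corner landmark pair 31/37 is assumed, not the paper's indexing).
    mouth_width = dists[:, 31, 37]
    mouth_expansion = mouth_width.max() - mouth_width[0]

    return np.concatenate([path_length, mean_speed, peak_speed, [mouth_expansion]])
```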

They then tested whether there were noticeable differences between men and women - and found that there were, with women's smiles being more expansive.

Lead researcher, Professor Hassan Ugail from the University of Bradford said: "Anecdotally, women are thought to be more expressive in how they smile, and our research has borne this out. Women definitely have broader smiles, expanding their mouth and lip area far more than men."

The team created an algorithm using their analysis and tested it against video footage of 109 people as they smiled. The computer was able to correctly determine gender in 86% of cases and the team believe the accuracy could easily be improved.

"We used a fairly simple machine classification for this research as we were just testing the concept, but more sophisticated AI would improve the recognition rates," said Professor Ugail.

The underlying purpose of this research is more about trying to enhance machine learning capabilities, but it has raised a number of intriguing questions that the team hopes to investigate in future projects.

One is how the machine might respond to the smile of a transgender person and the other is the impact of plastic surgery on recognition rates.

"Because this system measures the underlying muscle movement of the face during a , we believe these dynamics will remain the same even if external physical features change, following surgery for example," said Professor Ugail. "This kind of facial recognition could become a next- generation biometric, as it's not dependent on one feature, but on a dynamic that's unique to an individual and would be very difficult to mimic or alter."

The research is published in The Visual Computer: International Journal of Computer Graphics.

More information: Hassan Ugail et al, Is gender encoded in the smile? A computational framework for the analysis of the smile driven dynamic face for gender recognition, The Visual Computer (2018). DOI: 10.1007/s00371-018-1494-x