
New research introduces a method to improve the accuracy and speed of dynamic emotion recognition using a convolutional neural network (CNN) to analyze faces. The work undertaken by Lanbo Xu of Northeastern University in Shenyang, China, could have applications for mental health, human-computer interaction, security, and other areas.

The work is published in the International Journal of Biometrics.

Facial expressions are a major part of non-verbal communication, providing clues about an individual's emotional state. Until now, emotion recognition systems have mostly used static images, which means they cannot capture the changing nature of emotions as they play out across a person's face during a conversation, interview, or other interaction. Xu's work addresses this by focusing on video sequences: the system tracks changing expressions over a series of video frames and then offers a detailed analysis of how a person's emotions unfold in real time.
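Such per-frame analysis starts from individual video frames. As a minimal sketch (not Xu's published code), the snippet below uses OpenCV to pull every n-th frame from a clip so each one can later be scored by an emotion classifier; the sampling interval is an arbitrary assumption.

```python
# Minimal frame-sampling sketch using OpenCV; sampling every 5th frame
# is illustrative, not taken from the paper.
import cv2

def sample_frames(video_path: str, every_n: int = 5):
    """Yield every n-th frame of a video as a BGR image array."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # end of the stream
        if index % every_n == 0:
            yield frame
        index += 1
    cap.release()
```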

Prior to analysis, the system applies the "chaotic frog leap algorithm" to sharpen key facial features. The algorithm mimics the foraging behavior of frogs to find optimal parameters in the digital images. The most important part of the approach is a CNN trained on a dataset of human expressions, which allows the system to process new footage by recognizing patterns that overlap with the training data. By analyzing several frames from a video sequence, the system can capture movements of the mouth, eyes, and eyebrows, which are often subtle but important indicators of emotional change.
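The article does not spell out the optimization in detail, but the sketch below shows one plausible reading of the idea: a logistic (chaotic) map generates candidate strengths for an unsharp-mask filter, and the candidate that yields the crispest edges is kept. The `sharpen` and `sharpness_score` helpers and the search range are illustrative assumptions, standing in for the chaotic frog leap algorithm rather than reproducing it.

```python
# Hedged sketch: chaotic-map-driven search for a sharpening strength.
# A simplified stand-in for the chaotic frog leap optimization,
# not the author's implementation.
import cv2
import numpy as np

def sharpen(frame: np.ndarray, strength: float) -> np.ndarray:
    """Unsharp masking: boost the frame by a weighted difference from its blur."""
    blurred = cv2.GaussianBlur(frame, (0, 0), sigmaX=3)
    return cv2.addWeighted(frame, 1.0 + strength, blurred, -strength, 0)

def sharpness_score(frame: np.ndarray) -> float:
    """Variance of the Laplacian, a common proxy for edge crispness."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def chaotic_search(frame: np.ndarray, steps: int = 20) -> float:
    """Pick a sharpening strength by sampling candidates from a logistic map."""
    x, best_strength, best_score = 0.7, 0.0, -np.inf
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)      # logistic map in its chaotic regime
        strength = 2.0 * x           # map [0, 1] onto a [0, 2] strength range
        score = sharpness_score(sharpen(frame, strength))
        if score > best_score:
            best_strength, best_score = strength, score
    return best_strength
```

The sharpened frames would then be passed to the trained CNN for per-frame emotion scoring.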

Xu reports an accuracy of up to 99%, with the system providing an output within a fraction of a second. Such precision and speed are well suited to real-time use wherever detecting emotion might be useful without relying on subjective assessment by another person or team. One potential application is improving human-computer interaction, letting a computer respond appropriately to a user's emotional state, such as frustration, anger, or boredom.
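To illustrate how per-frame CNN outputs might become a single, near-instant prediction for a clip, the sketch below averages the probability vectors returned by any frame-level classifier and picks the top class. The label set and the `classify_frame` callback are assumptions for illustration, not details from the paper.

```python
# Hedged sketch: aggregate per-frame class probabilities into one clip-level label.
from typing import Callable, Iterable
import numpy as np

LABELS = ["angry", "happy", "sad", "surprised", "neutral"]  # illustrative label set

def classify_clip(frames: Iterable[np.ndarray],
                  classify_frame: Callable[[np.ndarray], np.ndarray]) -> str:
    """Average per-frame probability vectors and return the top clip-level label."""
    probs = np.mean([classify_frame(f) for f in frames], axis=0)
    return LABELS[int(np.argmax(probs))]
```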

The system might be useful in screening people for emotional disorders without initial human intervention. It could also be used to enhance security systems, granting access to resources only to people in a particular emotional state and, perhaps, barring entry to someone who is angry or upset. The same system could even be used to identify driver fatigue on transport systems or in one's own vehicle. The entertainment and marketing sectors might also see applications, where understanding emotional responses could improve content development, delivery, and consumer engagement.

More information: Lanbo Xu, Dynamic emotion recognition of human face based on convolutional neural network, International Journal of Biometrics (2024). DOI: 10.1504/IJBM.2024.140785
