FER-Pep: A Deep Learning Based Facial Emotion Recognition Framework for Humanoid Robot Pepper

TU Ahmed, D Mishra - International Conference on Human-Computer …, 2024 - Springer
Abstract
Equipping robots with the social skills needed to make human-robot interaction more natural, authentic, and lifelike is a challenging task in the domain of human-robot communication. A key component of this is the robot's ability to perceive and understand human emotional states. Emotion detection has received considerable attention in the broader fields of human-machine interaction and affective computing. In this research, an improved facial expression recognition framework is developed for the humanoid robot Pepper, enabling it to recognize human facial emotions beyond the seven basic expressions. Three additional facial expressions, mockery, think, and wink, are introduced alongside the seven basic expressions: anger, disgust, happiness, neutral, fear, sadness, and surprise. Several deep learning models, namely MobileNetV2, Residual Attention Network, Vision Transformer (ViT), and EfficientNetV2, are evaluated on this Facial Emotion Recognition (FER) task. EfficientNetV2 proves to be the most robust model for FER, outperforming the other candidates with a validation accuracy, recall, and F1 score of 88.23%, 88.61%, and 88.19%, respectively.
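The abstract does not specify the framework or training setup used, but the core idea of adapting EfficientNetV2 to a 10-class expression vocabulary (the seven basic expressions plus mockery, think, and wink) can be illustrated with a minimal sketch. The snippet below assumes a PyTorch/torchvision EfficientNetV2-S backbone with an ImageNet-pretrained checkpoint and a replaced classification head; the input size, weights, and class ordering are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: adapting EfficientNetV2-S for the 10-class FER task
# described in the abstract. Framework choice and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torchvision import models

# 7 basic expressions + 3 introduced ones (mockery, think, wink); order is illustrative.
NUM_CLASSES = 10

def build_fer_model() -> nn.Module:
    # Start from an ImageNet-pretrained backbone and swap in a 10-way head.
    model = models.efficientnet_v2_s(
        weights=models.EfficientNet_V2_S_Weights.IMAGENET1K_V1
    )
    in_features = model.classifier[1].in_features
    model.classifier[1] = nn.Linear(in_features, NUM_CLASSES)
    return model

if __name__ == "__main__":
    model = build_fer_model()
    model.eval()
    # One 224x224 RGB face crop yields one logit per expression class.
    dummy_face = torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        logits = model(dummy_face)
    predicted_class = logits.argmax(dim=1)
    print(logits.shape, predicted_class)  # torch.Size([1, 10])
```

In a pipeline like the one described, Pepper's camera would supply the face crop and the predicted class index would be mapped back to an expression label; the reported accuracy, recall, and F1 would then be computed on a held-out validation split.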