A Novel Model for Emotion Detection from Facial Muscles Activity
Paper in proceeding, 2020

Incorporating human emotion into different applications and systems has received substantial attention over the last three decades. The traditional approach to emotion detection is to first extract features and then apply a classifier, such as an SVM, to predict the correct class. However, recently proposed deep-learning-based models outperform traditional machine learning approaches without requiring a separate feature extraction phase. This paper proposes a novel deep-learning-based facial emotion detection model that uses facial muscle activity as raw input to recognize the type of expressed emotion in real time. To this end, we first use OpenFace to extract the activation values of the facial muscles (facial action units), which are then presented to a Stacked Auto Encoder (SAE) as the feature set. The SAE learns the combination of muscles that best describes a particular emotion, and the resulting features are fed to a Softmax layer to perform the multi-class classification task. The proposed model has been applied to the CK+, MMI and RAVDESS datasets and achieved average accuracies of 95.63%, 95.58%, and 84.91%, respectively, for emotion detection over six classes, outperforming state-of-the-art algorithms.
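The following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: action-unit intensities (as produced by OpenFace) feed a stacked auto-encoder whose bottleneck features are classified by a softmax layer. The 17-AU input size, layer widths, and training losses are assumptions for illustration only.

# Sketch (assumed architecture, PyTorch): SAE over OpenFace AU intensities + softmax head
import torch
import torch.nn as nn

N_AUS = 17       # OpenFace 2.x reports intensities for 17 action units (assumed input size)
N_CLASSES = 6    # six emotion classes, as in the paper

class StackedAutoEncoder(nn.Module):
    """Two stacked encoder/decoder pairs; the bottleneck serves as the learned feature set."""
    def __init__(self, in_dim=N_AUS, h1=12, h2=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, h1), nn.ReLU(),
            nn.Linear(h1, h2), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(h2, h1), nn.ReLU(),
            nn.Linear(h1, in_dim),
        )

    def forward(self, x):
        z = self.encoder(x)          # compressed representation of the muscle activations
        return self.decoder(z), z    # reconstruction for pre-training, plus latent features

class EmotionClassifier(nn.Module):
    """Softmax classification layer on top of the SAE bottleneck."""
    def __init__(self, sae, feat_dim=8, n_classes=N_CLASSES):
        super().__init__()
        self.sae = sae
        self.head = nn.Linear(feat_dim, n_classes)  # logits; softmax applied inside the loss

    def forward(self, au_intensities):
        _, z = self.sae(au_intensities)
        return self.head(z)

# Usage with random stand-in data (real input would be OpenFace AU intensity vectors):
sae = StackedAutoEncoder()
x = torch.rand(32, N_AUS)                              # batch of 32 frames
recon, _ = sae(x)
pretrain_loss = nn.functional.mse_loss(recon, x)       # unsupervised reconstruction loss

model = EmotionClassifier(sae)
labels = torch.randint(0, N_CLASSES, (32,))
logits = model(x)
finetune_loss = nn.functional.cross_entropy(logits, labels)  # softmax + negative log-likelihood

A common training scheme for such a model is to pre-train the auto-encoder on the reconstruction loss and then fine-tune the whole network with the classification loss; whether the paper follows exactly this schedule is not stated in the record above.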

Stacked Auto Encoder

Facial Muscles Activity

Facial Action Units

Facial Emotion Recognition

Author

Elahe Bagheri

Vrije Universiteit Brussel (VUB)

Azam Bagheri

Chalmers, Electrical Engineering, Electric Power Engineering

Pablo G. Esteban

Vrije Universiteit Brussel (VUB)

Bram Vanderborght

Vrije Universiteit Brussel (VUB)

Advances in Intelligent Systems and Computing

2194-5357 (ISSN) 2194-5365 (eISSN)

Vol. 1093, pp. 237-249
978-3-030-36149-5 (ISBN)

Robot 2019: Fourth Iberian Robotics Conference
Porto, Portugal

Subject Categories

Other Computer and Information Science

Bioinformatics (Computational Biology)

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1007/978-3-030-36150-1_20

More information

Latest update

12/23/2020