A Novel Model for Emotion Detection from Facial Muscles Activity
Paper in proceedings, 2020

Considering human emotion in different applications and systems has received substantial attention over the last three decades. The traditional approach to emotion detection is to first extract features and then apply a classifier, such as an SVM, to find the true class. However, recently proposed deep learning based models outperform traditional machine learning approaches without requiring a separate feature extraction phase. This paper proposes a novel deep learning based facial emotion detection model, which uses facial muscle activity as raw input to recognize the type of the expressed emotion in real time. To this end, we first use OpenFace to extract the activation values of the facial muscles, which are then presented to a Stacked Auto Encoder (SAE) as a feature set. The SAE returns the best combination of muscles for describing a particular emotion; these extracted features are finally applied to a Softmax layer to perform the multi-class classification task. The proposed model has been applied to the CK+, MMI and RAVDESS datasets and achieved average accuracies of 95.63%, 95.58%, and 84.91%, respectively, for emotion type detection over six classes, outperforming state-of-the-art algorithms.
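The pipeline described in the abstract (Action Unit activations from OpenFace, compressed by a stacked encoder, classified by a Softmax layer) can be sketched as a forward pass. This is a minimal illustration, not the authors' implementation: the layer widths, the choice of ReLU, and the use of 17 Action Units (the number OpenFace reports intensities for) are assumptions, and the weights are random stand-ins for the pretrained SAE encoders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 17 OpenFace AU intensities -> two encoder layers -> 6 emotions.
N_AUS, H1, H2, N_CLASSES = 17, 12, 8, 6

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Randomly initialised weights stand in for the pretrained SAE encoder layers
# and the Softmax classifier weights.
W1 = rng.normal(scale=0.1, size=(N_AUS, H1)); b1 = np.zeros(H1)
W2 = rng.normal(scale=0.1, size=(H1, H2));    b2 = np.zeros(H2)
W3 = rng.normal(scale=0.1, size=(H2, N_CLASSES)); b3 = np.zeros(N_CLASSES)

def predict_proba(au_batch):
    """au_batch: (batch, 17) AU intensities, roughly in [0, 5] as OpenFace reports."""
    h1 = relu(au_batch @ W1 + b1)   # first encoder layer
    h2 = relu(h1 @ W2 + b2)         # second encoder layer: compressed muscle features
    return softmax(h2 @ W3 + b3)    # 6-way emotion probability distribution

probs = predict_proba(rng.uniform(0, 5, size=(4, N_AUS)))
print(probs.shape)  # (4, 6); each row sums to 1
```

In the paper's setup, the encoder layers would first be pretrained as autoencoders (each layer learning to reconstruct its input) before the Softmax layer is attached for supervised fine-tuning; this sketch shows only the resulting inference path.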

Stacked Auto Encoder

Facial Muscles Activity

Facial Action Units

Facial Emotion Recognition

Authors

Elahe Bagheri

Vrije Universiteit Brussel (VUB)

Azam Bagheri

Chalmers, Elektroteknik, Elkraftteknik, Elnät och komponenter

Pablo G. Esteban

Vrije Universiteit Brussel (VUB)

Bram Vanderborght

Vrije Universiteit Brussel (VUB)

Advances in Intelligent Systems and Computing

2194-5357 (ISSN)

Vol. 1093, pp. 237-249

Robot 2019: Fourth Iberian Robotics Conference
Porto, Portugal

Subject categories

Other Computer and Information Science

Bioinformatics (Computational Biology)

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1007/978-3-030-36150-1_20

More information

Last updated

2020-12-23