Feedback facial expressions and emotions

Abstract

The paper investigates the relation between emotions and feedback facial expressions in video- and audio-recorded Danish dyadic first encounters. In particular, we train a classifier on the manual annotations of the corpus in order to investigate to what extent the encoding of emotions contributes to the prediction of the feedback functions of facial expressions. This work builds upon and extends previous research on (a) the annotation and analysis of emotions in the corpus, in which it was suggested that emotions are related to specific communicative functions, and (b) the prediction of feedback head movements using multimodal information. The results of the experiments show that information on multimodal behaviours which co-occur with the facial expressions improves the classifier's performance. The improvement of the F-measure over the unimodal baseline is 0.269, a result parallel to that obtained for head movements in the same corpus. The experiments also show that the annotations of emotions contribute further to the prediction of feedback facial expressions, confirming this relation. The best results are obtained by training the classifier on the shape of the facial expressions and co-occurring head movements, emotion labels, and the gesturer's and the interlocutor's speech, and they can be used in applied systems.
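The experimental setup described above can be sketched as follows. This is a hypothetical illustration only: the feature names, toy instances, and the Naive Bayes learner are invented stand-ins (the paper trained on the actual manual annotations of the Danish first-encounters corpus, and its classifier and feature set are not reproduced here). The sketch shows the general idea of comparing a unimodal baseline (expression shape only) against a classifier that also sees co-occurring head movements, emotion labels, and speech.

```python
# Hypothetical sketch: predict whether a facial expression serves a feedback
# function from categorical annotation features. A tiny Naive Bayes classifier
# with add-one smoothing stands in for the learner; all data are invented.
from collections import Counter, defaultdict
import math

def train_nb(X, y):
    """Fit per-class priors and smoothed per-feature value counts."""
    classes = Counter(y)
    counts = defaultdict(Counter)  # (class, feature) -> Counter of values
    for feats, label in zip(X, y):
        for f, v in feats.items():
            counts[(label, f)][v] += 1
    return classes, counts

def predict_nb(model, feats):
    """Return the class with the highest smoothed log-probability."""
    classes, counts = model
    total = sum(classes.values())
    best, best_lp = None, -math.inf
    for c, n in classes.items():
        lp = math.log(n / total)
        for f, v in feats.items():
            vc = counts[(c, f)]
            # add-one smoothing over the seen values plus one unseen slot
            lp += math.log((vc[v] + 1) / (n + len(vc) + 1))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

def f1(y_true, y_pred, pos="feedback"):
    """F-measure for the positive (feedback) class."""
    tp = sum(t == pos and p == pos for t, p in zip(y_true, y_pred))
    fp = sum(t != pos and p == pos for t, p in zip(y_true, y_pred))
    fn = sum(t == pos and p != pos for t, p in zip(y_true, y_pred))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

# Toy annotated instances (invented for illustration).
data = [
    ({"shape": "smile", "head": "nod", "emotion": "amused", "speech": "interlocutor"}, "feedback"),
    ({"shape": "smile", "head": "none", "emotion": "friendly", "speech": "gesturer"}, "other"),
    ({"shape": "frown", "head": "shake", "emotion": "surprised", "speech": "interlocutor"}, "feedback"),
    ({"shape": "raise_brows", "head": "nod", "emotion": "interested", "speech": "interlocutor"}, "feedback"),
    ({"shape": "frown", "head": "none", "emotion": "neutral", "speech": "gesturer"}, "other"),
] * 4
X, y = [d[0] for d in data], [d[1] for d in data]

# Unimodal baseline (shape only) vs. multimodal feature set, mirroring the
# comparison reported in the abstract.
scores = {}
for name, keys in [("unimodal", ["shape"]),
                   ("multimodal", ["shape", "head", "emotion", "speech"])]:
    Xk = [{k: feats[k] for k in keys} for feats in X]
    model = train_nb(Xk, y)
    preds = [predict_nb(model, feats) for feats in Xk]
    scores[name] = round(f1(y, preds), 3)
print(scores)
```

On this toy data the multimodal feature set separates the classes while shape alone does not, so the multimodal F-measure exceeds the unimodal one; the actual improvement of 0.269 reported in the paper of course comes from the real corpus annotations, not from this sketch.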
Original language: English
Journal: Journal on Multimodal User Interfaces
Volume: 8
Pages (from-to): 135–141
Number of pages: 7
ISSN: 1783-7677
Publication status: Published - Jun 2014
