Head movements, facial expressions and feedback in conversations - Empirical evidence from Danish multimodal data

Abstract

This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate relations between non-verbal behaviour (more specifically, head movements and facial expressions) and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically on the basis of the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.
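To make the classification setup concrete, the sketch below illustrates, in Python with scikit-learn rather than the authors' actual toolchain, how categorical head-movement and facial-expression annotations could be encoded and fed to a simple classifier that predicts the dialogue act type of a feedback expression. All feature names, values, and labels here are hypothetical placeholders, not the corpus annotation scheme or the paper's reported setup.

```python
# Illustrative sketch only: a toy classifier predicting the dialogue act
# type of a feedback expression from non-verbal features, loosely in the
# spirit of the map-task experiments described above. Feature names,
# values, and labels are hypothetical, not the corpus annotation scheme.
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier

# One dict of categorical non-verbal features per feedback expression.
samples = [
    {"head": "nod", "face": "smile"},
    {"head": "nod", "face": "none"},
    {"head": "shake", "face": "frown"},
    {"head": "tilt", "face": "raised_brows"},
    {"head": "nod", "face": "none"},
    {"head": "shake", "face": "none"},
]
# Hypothetical dialogue act labels for the same expressions.
labels = ["accept", "accept", "reject", "other", "accept", "reject"]

# One-hot encode the categorical gesture features.
vec = DictVectorizer(sparse=False)
X = vec.fit_transform(samples)

# Train a simple decision tree and classify an unseen gesture combination.
clf = DecisionTreeClassifier(random_state=0)
clf.fit(X, labels)
new_sample = vec.transform([{"head": "nod", "face": "smile"}])
print(clf.predict(new_sample))  # e.g. ['accept']
```

In this kind of setup, combining the gesture features with speech features (as in the conversational-data experiments) would simply mean adding further keys to each feature dict before vectorisation.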

Original language: English
Journal: Journal on Multimodal User Interfaces
Volume: 7
Issue number: 1-2
Pages (from-to): 29-37
ISSN: 1783-7677
DOIs
Publication status: Published - Mar 2013
