Head movements, facial expressions and feedback in conversations - Empirical evidence from Danish multimodal data


Abstract

This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour (more specifically, head movements and facial expressions) and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.
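To make the first experiment type concrete, below is a minimal, purely illustrative sketch of classifying the dialogue act type of a feedback expression from annotated non-verbal features. The feature values, dialogue act labels, and choice of classifier (a random forest via scikit-learn) are assumptions for illustration and are not taken from the article.

```python
# Hypothetical sketch: predict the dialogue act type of a feedback
# expression from categorical head-movement and facial-expression
# annotations. All feature names, labels, and the classifier choice
# are illustrative assumptions, not the authors' actual setup.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import OneHotEncoder
import numpy as np

# Each row: (head movement, facial expression, dialogue act label).
samples = [
    ("nod",   "smile",   "Accept"),
    ("nod",   "neutral", "Accept"),
    ("nod",   "neutral", "Accept"),
    ("shake", "frown",   "Reject"),
    ("shake", "neutral", "Reject"),
    ("tilt",  "neutral", "RepeatRephrase"),
    ("tilt",  "smile",   "RepeatRephrase"),
]
X_raw = np.array([[head, face] for head, face, _ in samples])
y = np.array([act for _, _, act in samples])

# One-hot encode the categorical gesture features.
X = OneHotEncoder().fit_transform(X_raw)

# Cross-validated accuracy of predicting the dialogue act type
# from non-verbal features alone.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=2)
print("mean accuracy over folds:", scores.mean())
```

The second experiment type described in the abstract would follow the same pattern, but with speech features added to the non-verbal ones and a binary label distinguishing feedback from other behaviours.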

Original language: English
Journal: Journal on Multimodal User Interfaces
Volume: 7
Issue number: 1-2
Pages (from-to): 29-37
ISSN: 1783-7677
DOI
Status: Published - Mar 2013

