TY - JOUR
T1 - Head movements, facial expressions and feedback in conversations - Empirical evidence from Danish multimodal data
AU - Paggio, Patrizia
AU - Navarretta, Costanza
PY - 2013/3
Y1 - 2013/3
N2 - This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour (more specifically head movements and facial expressions) and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.
AB - This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour (more specifically head movements and facial expressions) and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.
U2 - 10.1007/s12193-012-0105-9
DO - 10.1007/s12193-012-0105-9
M3 - Journal article
SN - 1783-7677
VL - 7
SP - 29
EP - 37
JO - Journal on Multimodal User Interfaces
JF - Journal on Multimodal User Interfaces
IS - 1-2
ER -