Predicting an Individual’s Gestures from the Interlocutor’s Co-occurring Gestures and Related Speech

Abstract

Overlapping speech and gestures are common in face-to-face conversations and have been interpreted as a sign of synchronization between conversation participants. A number of gestures are even mirrored or mimicked. We therefore hypothesize that the gestures of one subject can contribute to predicting gestures of the same type produced by the other subject. In this work, we also investigate whether the speech segments to which these gestures are related contribute to the prediction. The results of our pilot experiments show that a Naive Bayes classifier trained on the duration and shape features of head movements and facial expressions contributes to identifying the presence and shape of head movements and facial expressions, respectively. Speech only contributes to prediction in the case of facial expressions. These results show that the gestures of the interlocutor are one of the numerous factors to be accounted for when modeling gesture production in conversational interactions, which is relevant to the development of socio-cognitive ICT.
Original language: English
Title of host publication: Proceedings of the IEEE 7th International Conference on Cognitive Infocommunications
Number of pages: 5
Publisher: IEEE Signal Processing Society
Publication date: 2016
Pages: 233-237
ISBN (Print): 978-1-5090-2644-9
Publication status: Published - 2016
