Classification of Feedback Expressions in Multimodal Data


Abstract

This paper addresses how linguistic feedback expressions, prosody, and head gestures (i.e. head movements and facial expressions) relate to one another in a collection of eight video-recorded Danish map-task dialogues. The study shows that in these data, prosodic features and head gestures significantly improve the automatic classification of dialogue-act labels for linguistic feedback expressions.
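The abstract describes classifying dialogue-act labels from combined lexical, prosodic, and gestural features. A minimal sketch of that idea, assuming toy feature names and labels (the actual feature set and tagset of the paper are not reproduced here), could look like the following naive-Bayes-style tally over categorical multimodal features:

```python
# Hypothetical sketch: combining lexical, prosodic, and gesture features
# for feedback dialogue-act classification. Feature names, values, and
# labels below are illustrative assumptions, not the paper's data.
from collections import Counter, defaultdict

# Each sample: (feature dict, dialogue-act label)
samples = [
    ({"token": "ja",   "pitch": "rising",  "head": "nod"},   "Accept"),
    ({"token": "ja",   "pitch": "falling", "head": "nod"},   "Accept"),
    ({"token": "ja",   "pitch": "rising",  "head": "none"},  "Answer"),
    ({"token": "nej",  "pitch": "falling", "head": "shake"}, "Reject"),
    ({"token": "nej",  "pitch": "rising",  "head": "shake"}, "Reject"),
    ({"token": "okay", "pitch": "falling", "head": "nod"},   "Accept"),
]

def train(samples):
    """Count label frequency per (feature, value) pair, plus label priors."""
    counts = defaultdict(Counter)
    priors = Counter()
    for feats, label in samples:
        priors[label] += 1
        for f, v in feats.items():
            counts[(f, v)][label] += 1
    return counts, priors

def classify(feats, counts, priors):
    """Score each label by its prior times add-one-smoothed feature counts."""
    scores = {}
    for label, prior in priors.items():
        score = prior
        for f, v in feats.items():
            score *= counts[(f, v)][label] + 1
        scores[label] = score
    return max(scores, key=scores.get)

counts, priors = train(samples)
print(classify({"token": "ja", "pitch": "falling", "head": "nod"},
               counts, priors))
```

Dropping the `pitch` or `head` features from the dictionaries simulates a unimodal baseline, which is the kind of comparison the abstract's claim about multimodal improvement rests on.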

Original language: English
Title of host publication: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Number of pages: 7
Publisher: Association for Computational Linguistics
Publication date: 2010
Pages: 318-324
Publication status: Published - 2010
Event: Annual Meeting of the Association for Computational Linguistics - Uppsala, Sweden
Duration: 11 Jul 2010 - 16 Aug 2010
Conference number: 48

