Sequence classification with human attention

Maria Jung Barrett, Joachim Bingel, Nora Hollenstein, Marek Rei, Anders Søgaard

22 Citations (Scopus)
1 Download (Pure)

Abstract

Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper, we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.
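The abstract describes regularizing a model's attention distribution toward estimated human attention from eye-tracking corpora. A minimal NumPy sketch of that idea is below, assuming a simple mean-squared-error penalty between the model's attention weights and normalized gaze durations; the paper's actual architecture and objective may differ, and all names here are illustrative.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of attention scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_regularized_loss(task_loss, attn_scores, human_attn, lam=0.1):
    """Joint loss: the task loss plus a penalty pulling the model's
    attention distribution toward (estimated) human attention.
    The MSE regularizer and lambda weighting are illustrative choices."""
    attn = softmax(attn_scores)
    reg = np.mean((attn - human_attn) ** 2)
    return task_loss + lam * reg

# Toy example: 4 tokens, human attention concentrated on one salient word.
scores = np.array([0.1, 2.0, 0.3, 0.2])   # model's raw attention scores
human = np.array([0.05, 0.8, 0.1, 0.05])  # normalized gaze durations (sums to 1)
loss = attention_regularized_loss(task_loss=0.7, attn_scores=scores, human_attn=human)
```

During training, minimizing this joint objective nudges the attention weights toward the human distribution while still optimizing the downstream task.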
Original language: English
Title of host publication: Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)
Editors: Anna Korhonen, Ivan Titov
Publisher: Association for Computational Linguistics
Publication date: 2018
Pages: 302–312
ISBN (Print): 978-1-948087-72-8
Status: Published - 2018
Event: 22nd Conference on Computational Natural Language Learning (CoNLL 2018) - Brussels, Belgium
Duration: 31 Oct 2018 – 1 Nov 2018

Conference

Conference: 22nd Conference on Computational Natural Language Learning (CoNLL 2018)
Country/Territory: Belgium
City: Brussels
Period: 31/10/2018 – 01/11/2018
