When does deep multi-task learning work for loosely related document classification tasks?

Emma Kerinec, Anders Søgaard, Chloé Braud

    Abstract

    This work aims to contribute to our understanding of when multi-task learning through parameter sharing in deep neural networks leads to improvements over single-task learning. We focus on the setting of learning from loosely related tasks, for which no theoretical guarantees exist. We therefore approach the question empirically, studying which properties of datasets and single-task learning characteristics correlate with improvements from multi-task learning. We are the first to study this in a text classification setting and across more than 500 different task pairs.
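
    The abstract describes multi-task learning through parameter sharing in a deep network, with one task head per document classification task. The following is a minimal sketch of that general setup, not the authors' implementation: it assumes a PyTorch-style BiLSTM encoder shared between two hypothetical tasks "a" and "b", with illustrative dimensions and random stand-in batches.

    ```python
    # Hard parameter sharing sketch: a shared encoder with task-specific heads,
    # trained by alternating between batches from two loosely related tasks.
    # All names and sizes are illustrative, not from the paper.
    import torch
    import torch.nn as nn

    class SharedEncoderMTL(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, hidden_dim=128,
                     n_classes_a=2, n_classes_b=2):
            super().__init__()
            # Shared parameters: embedding layer + BiLSTM document encoder
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                                   bidirectional=True)
            # Task-specific classification heads (the only unshared parameters)
            self.head_a = nn.Linear(2 * hidden_dim, n_classes_a)
            self.head_b = nn.Linear(2 * hidden_dim, n_classes_b)

        def forward(self, token_ids, task):
            emb = self.embed(token_ids)
            _, (h, _) = self.encoder(emb)
            # Concatenate the final forward and backward hidden states
            doc = torch.cat([h[0], h[1]], dim=-1)
            return self.head_a(doc) if task == "a" else self.head_b(doc)

    model = SharedEncoderMTL(vocab_size=10000)
    loss_fn = nn.CrossEntropyLoss()
    optim = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Alternate updates over the two tasks; batches here are random stand-ins.
    for step in range(100):
        task = "a" if step % 2 == 0 else "b"
        x = torch.randint(1, 10000, (16, 50))   # 16 documents, 50 tokens each
        y = torch.randint(0, 2, (16,))
        loss = loss_fn(model(x, task), y)
        optim.zero_grad()
        loss.backward()
        optim.step()
    ```

    Under single-task learning, each task would instead train its own copy of the encoder; the paper's question is when sharing the encoder between loosely related tasks helps rather than hurts.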
    Original language: English
    Title of host publication: Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
    Publisher: Association for Computational Linguistics
    Publication date: 2018
    Pages: 1-8
    Publication status: Published - 2018
    Event: 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP - Brussels, Belgium
    Duration: 1 Nov 2018 - 1 Nov 2018

    Workshop

    Workshop: 2018 EMNLP Workshop BlackboxNLP
    Country/Territory: Belgium
    City: Brussels
    Period: 01/11/2018 - 01/11/2018