Abstract
This work aims to contribute to our understanding of when multi-task learning through parameter sharing in deep neural networks leads to improvements over single-task learning. We focus on the setting of learning from loosely related tasks, for which no theoretical guarantees exist. We therefore approach the question empirically, studying which properties of datasets and single-task learning characteristics correlate with improvements from multi-task learning. We are the first to study this in a text classification setting and across more than 500 different task pairs.
Original language | English
---|---
Title of host publication | Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Publisher | Association for Computational Linguistics
Publication date | 2018
Pages | 1-8
Publication status | Published - 2018
Event | 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, Brussels, Belgium, 1 Nov 2018
Workshop
Workshop | 2018 EMNLP Workshop BlackboxNLP
---|---
Country/Territory | Belgium
City | Brussels
Period | 01/11/2018 → 01/11/2018