Abstract
Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to some compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data. While it has brought significant improvements in a number of NLP tasks, mixed results have been reported, and little is known about the conditions under which MTL leads to gains in NLP. This paper sheds light on the specific task relations that can lead to gains from MTL models over single-task setups.
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers |
| Number of pages | 6 |
| Volume | 2 |
| Publisher | Association for Computational Linguistics |
| Publication date | 2017 |
| Pages | 164-169 |
| ISBN (Electronic) | 9781510838604 |
| Publication status | Published - 2017 |
| Event | 15th Conference of the European Chapter of the Association for Computational Linguistics, Valencia, Spain, 3 Apr 2017 → 7 Apr 2017 (Conference number: 15) |
Conference

| Conference | 15th Conference of the European Chapter of the Association for Computational Linguistics |
| --- | --- |
| Number | 15 |
| Country/Territory | Spain |
| City | Valencia |
| Period | 03/04/2017 → 07/04/2017 |