Abstract

We present a design pattern for composing deep learning networks in a typed, higher-order fashion. The exposed library functions are generically typed, and the composition structure allows networks to be trained (using backpropagation) and trained networks to be used for predicting new results (using forward propagation). Individual layers in a network can take different forms, ranging from dense sigmoid layers to convolutional layers. The paper discusses different typing techniques aimed at enforcing proper use and composition of networks. The approach is implemented in Futhark, a data-parallel functional language and compiler targeting GPU architectures, and we demonstrate that Futhark's elimination of higher-order functions and modules leads to efficient generated code.
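
To make the composition pattern concrete, here is a minimal Futhark sketch. It is an illustration only, not the paper's actual library API: the names layer, compose, sigmoid, and dense are hypothetical. A layer is modelled as a record pairing its weights with a forward function, and sequential composition pairs the weights and pipes one layer's output into the next:

    -- A layer couples a weight type 'w with a forward function.
    -- The type must be fully lifted (type^) since it contains a function.
    type^ layer 'w 'a 'b = { weights: w, forward: w -> a -> b }

    -- Sequential composition: pair up the weights and feed the first
    -- layer's output into the second.
    def compose 'w1 'w2 'a 'b 'c
                (l1: layer w1 a b) (l2: layer w2 b c)
              : layer (w1, w2) a c =
      { weights = (l1.weights, l2.weights)
      , forward = \(ws1, ws2) x -> l2.forward ws2 (l1.forward ws1 x) }

    -- A dense layer with sigmoid activation, one possible instance.
    def sigmoid (x: f32) : f32 = 1 / (1 + f32.exp (-x))

    def dense [n][m] (ws0: [m][n]f32) (bs0: [m]f32)
                   : layer ([m][n]f32, [m]f32) ([n]f32) ([m]f32) =
      { weights = (ws0, bs0)
      , forward = \(ws, bs) x ->
          map2 (\w b -> sigmoid (b + f32.sum (map2 (*) w x))) ws bs }

Because Futhark defunctionalises higher-order code and eliminates modules at compile time, compositions of this shape compile to first-order code, which is the property behind the efficiency claim in the abstract.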

Original language: English
Title of host publication: FHPNC 2019 - Proceedings of the 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, co-located with ICFP 2019
Editors: Marco Zocca
Number of pages: 13
Publisher: Association for Computing Machinery, Inc.
Publication date: 18 Aug 2019
Pages: 47-59
ISBN (Electronic): 9781450368148
Publication status: Published - 18 Aug 2019
Event: 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, FHPNC 2019, co-located with ICFP 2019 - Berlin, Germany
Duration: 18 Aug 2019 → …

Conference

Conference: 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, FHPNC 2019, co-located with ICFP 2019
Country/Territory: Germany
City: Berlin
Period: 18/08/2019 → …
Sponsor: ACM SIGPLAN
Series: FHPNC 2019 - Proceedings of the 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, co-located with ICFP 2019

Keywords

  • Data-parallelism
  • Deep learning
  • Functional languages
