Massively-parallel best subset selection for ordinary least-squares regression

Fabian Gieseke, Kai Lars Polsterer, Ashish Mahabal, Christian Igel, Tom Heskes

2 Citations (Scopus)

Abstract

Selecting an optimal subset of k out of d features for linear regression models given n training instances is often considered intractable for feature spaces with hundreds or thousands of dimensions. We propose an efficient massively-parallel implementation that selects such optimal feature subsets in a brute-force fashion for small k. By exploiting the enormous compute power provided by modern parallel devices such as graphics processing units, it can handle thousands of input dimensions even on standard commodity hardware. We evaluate the practical runtime using artificial datasets and sketch the applicability of our framework in the context of astronomy.
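To illustrate the underlying problem, the following is a minimal sequential sketch of brute-force best subset selection for ordinary least squares, not the authors' massively-parallel GPU implementation: every k-subset of the d features is fitted by OLS and the subset with the lowest residual sum of squares is kept. Function and variable names are illustrative assumptions.

```python
import itertools
import numpy as np

def best_subset_ols(X, y, k):
    """Brute-force best-subset selection for OLS (sequential sketch).

    Evaluates every k-subset of the d columns of X by ordinary
    least squares and returns the subset with the lowest residual
    sum of squares, together with that RSS value.
    """
    n, d = X.shape
    best_rss, best_subset = np.inf, None
    for subset in itertools.combinations(range(d), k):
        Xs = X[:, subset]                                  # restrict to the candidate features
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)      # OLS fit on the subset
        rss = float(np.sum((y - Xs @ coef) ** 2))          # residual sum of squares
        if rss < best_rss:
            best_rss, best_subset = rss, subset
    return best_subset, best_rss
```

The loop runs over all C(d, k) subsets, which is exactly the combinatorial cost that makes the problem appear intractable for large d; the independence of the subset evaluations is what a massively-parallel device can exploit.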
Original language: English
Title: 2017 IEEE Symposium Series on Computational Intelligence (SSCI) Proceedings
Number of pages: 8
Publisher: IEEE
Publication date: 1 Jul 2017
Pages: 1-8
ISBN (electronic): 978-1-5386-2726-6
DOI
Status: Published - 1 Jul 2017
Event: 2017 IEEE Symposium Series on Computational Intelligence (SSCI) - Honolulu, USA
Duration: 27 Nov 2017 - 1 Dec 2017

Conference

Conference: 2017 IEEE Symposium Series on Computational Intelligence (SSCI)
Country/Territory: USA
City: Honolulu
Period: 27/11/2017 - 01/12/2017
