Abstract
Choosing good hyperparameters for neural networks remains a major challenge. This paper proposes a method that automatically initializes and adjusts hyperparameters during the training of stacked autoencoders. A population of autoencoders is trained with gradient-descent-based weight updates, while hyperparameters are mutated and weights are inherited in a Lamarckian manner. Training proceeds layer-wise, with each new layer initiating a new neuroevolutionary optimization process. The fitness function of the evolutionary approach employs a dimensionality-reduction quality measure. Experiments identify the contribution of the most significant hyperparameters and analyze their lineage during training. The results confirm that the proposed method outperforms a baseline approach on MNIST, FashionMNIST, and the Year Prediction Million Song Database.
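The core loop described in the abstract (lifetime gradient training, fitness-based selection, Lamarckian inheritance of trained weights, mutation of hyperparameters) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a linear tied-weight autoencoder, reconstruction MSE as a stand-in for the paper's dimensionality-reduction quality measure, and the learning rate as the only evolved hyperparameter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for one layer's input (the paper uses MNIST etc.).
X = rng.normal(size=(256, 20))

def init_individual(n_in=20, n_hidden=8):
    """One autoencoder: tied weights plus an evolvable learning rate."""
    return {"W": rng.normal(scale=0.1, size=(n_in, n_hidden)),
            "lr": 10 ** rng.uniform(-3, -1)}

def train_step(ind, X):
    """Lifetime learning: a few full-batch gradient steps on the
    reconstruction loss of a linear tied-weight autoencoder."""
    W, lr = ind["W"], ind["lr"]
    for _ in range(5):
        H = X @ W                                 # encode
        R = H @ W.T                               # decode (tied weights)
        E = R - X                                 # reconstruction error
        grad = X.T @ (E @ W) + E.T @ (X @ W)      # dL/dW for tied weights
        W = W - lr * grad / len(X)
    ind["W"] = W
    return ind

def fitness(ind, X):
    """Lower reconstruction MSE is better (a proxy for the paper's
    dimensionality-reduction quality measure)."""
    R = (X @ ind["W"]) @ ind["W"].T
    return float(np.mean((R - X) ** 2))

def evolve(pop, X, generations=10):
    for _ in range(generations):
        pop = [train_step(ind, X) for ind in pop]     # gradient-based training
        pop.sort(key=lambda ind: fitness(ind, X))     # selection
        parents = pop[: len(pop) // 2]
        children = [{"W": p["W"].copy(),              # Lamarckian: offspring
                     "lr": p["lr"] * 2 ** rng.normal(scale=0.3)}  # inherit trained
                    for p in parents]                 # weights; lr is mutated
        pop = parents + children
    return pop

pop = [init_individual() for _ in range(8)]
pop = evolve(pop, X)
best = min(pop, key=lambda ind: fitness(ind, X))
```

In the paper's layer-wise scheme, a loop like `evolve` would be restarted for each new stacked layer, with the previous layer's encodings as that layer's input.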
Original language | English |
---|---|
Title | 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings |
Number of pages | 8 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Publication date | Jun 2019 |
Pages | 823-830 |
Article number | 8790182 |
ISBN (electronic) | 9781728121536 |
DOI | |
Status | Published - Jun 2019 |
Event | 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Wellington, New Zealand Duration: 10 Jun 2019 → 13 Jun 2019 |
Conference

Conference | 2019 IEEE Congress on Evolutionary Computation, CEC 2019 |
---|---|
Country/Territory | New Zealand |
City | Wellington |
Period | 10/06/2019 → 13/06/2019 |
Sponsor | et al., Facebook, IEEE, IEEE CIS, Tourism New Zealand, Victoria University of Wellington |