Evolution of Stacked Autoencoders

Tim Silhan, Stefan Oehmcke*, Oliver Kramer

*Corresponding author for this work
    2 Citations (Scopus)

    Abstract

    Choosing the best hyperparameters for neural networks is a major challenge. This paper proposes a method that automatically initializes and adjusts hyperparameters during the training process of stacked autoencoders. A population of autoencoders is trained with gradient-descent-based weight updates, while hyperparameters are mutated and weights are inherited in a Lamarckian manner. Training is conducted layer-wise, with each new layer initiating a new neuroevolutionary optimization process. The fitness function of the evolutionary approach employs a dimensionality reduction quality measure. Experiments reveal the contribution of the most significant hyperparameters and analyze their lineage during the training process. The results confirm that the proposed method outperforms a baseline approach on MNIST, FashionMNIST, and the Year Prediction Million Song Database.
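    To make the described loop concrete, here is a minimal sketch (not the authors' implementation) of the evolutionary process for one layer: a population of autoencoders is trained with gradient descent, offspring inherit the trained weights (Lamarckian inheritance), and hyperparameters are mutated. Several simplifications are assumed: the autoencoders are linear with tied weights, only the learning rate is evolved, reconstruction error stands in for the paper's dimensionality reduction quality measure, and truncation selection is used. All function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)


def init_individual(n_in, n_hidden):
    # One autoencoder with its own hyperparameters; only the learning
    # rate is evolved in this sketch (the paper mutates more).
    return {"W": rng.normal(0.0, 0.1, (n_in, n_hidden)),
            "lr": 10 ** rng.uniform(-4.0, -2.0)}


def train_step(ind, X):
    # One gradient-descent step on ||X - X W W^T||^2
    # (linear units, tied weights, kept simple for brevity).
    H = X @ ind["W"]            # encode
    E = H @ ind["W"].T - X      # reconstruction error
    grad = (X.T @ E + E.T @ X) @ ind["W"]
    ind["W"] -= ind["lr"] * grad / len(X)
    return float(np.mean(E ** 2))


def mutate(parent):
    # Lamarckian inheritance: the child copies the parent's *trained*
    # weights; the hyperparameter is perturbed log-normally.
    return {"W": parent["W"].copy(),
            "lr": parent["lr"] * 10 ** rng.normal(0.0, 0.2)}


def evolve_layer(X, n_hidden, pop_size=8, generations=15, steps=10):
    pop = [init_individual(X.shape[1], n_hidden) for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness: reconstruction error after a few gradient steps,
        # a stand-in for the paper's quality measure.
        fits = [[train_step(ind, X) for _ in range(steps)][-1] for ind in pop]
        survivors = [pop[i] for i in np.argsort(fits)[: pop_size // 2]]
        pop = survivors + [mutate(p) for p in survivors]
    return pop[0]  # best evaluated individual


# Layer-wise training: each new layer starts a fresh
# neuroevolutionary run on the previous layer's encoding.
X = rng.normal(size=(256, 64))
for n_hidden in (32, 16):
    best = evolve_layer(X, n_hidden)
    X = X @ best["W"]  # encode and feed the next layer
    print(f"layer with {n_hidden} units, evolved lr = {best['lr']:.2e}")
```

    Because fitness is measured after the gradient steps, selection acts on the combined effect of the inherited weights and the mutated learning rate, which is the Lamarckian interplay the abstract describes.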

    Original language: English
    Title: 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings
    Number of pages: 8
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Publication date: Jun. 2019
    Pages: 823-830
    Article number: 8790182
    ISBN (Electronic): 9781728121536
    DOI
    Status: Published - Jun. 2019
    Event: 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Wellington, New Zealand
    Duration: 10 Jun. 2019 - 13 Jun. 2019

    Conference

    Conference: 2019 IEEE Congress on Evolutionary Computation, CEC 2019
    Country/Territory: New Zealand
    City: Wellington
    Period: 10/06/2019 - 13/06/2019
    Sponsors: Facebook, IEEE, IEEE CIS, Tourism New Zealand, Victoria University of Wellington, et al.
