Training RBMs based on the signs of the CD approximation of the log-likelihood derivatives

Asja Fischer, Christian Igel

1 citation (Scopus)

Abstract

Contrastive Divergence (CD) learning is frequently applied to Restricted Boltzmann Machines (RBMs), the building blocks of deep belief networks. It relies on biased approximations of the log-likelihood gradient, and this bias can deteriorate the learning process. It has been claimed that the signs of most components of the CD update equal the corresponding signs of the log-likelihood gradient. This suggests using optimization techniques that depend only on the signs. Resilient backpropagation is such a method, and we combine it with CD learning. However, it does not prevent the divergence caused by the approximation bias.
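The idea sketched in the abstract can be illustrated in code: estimate the log-likelihood gradient with CD-1, then apply a sign-based (Rprop-style) step-size adaptation. This is a minimal sketch under stated assumptions, not the authors' implementation; it uses a bias-free binary RBM and an iRprop-like rule, and all function and parameter names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_gradient(W, v0, rng):
    """CD-1 approximation of the log-likelihood gradient w.r.t. the
    weight matrix W of a binary RBM (biases omitted for brevity)."""
    h0 = sigmoid(v0 @ W)                              # hidden probs given data
    h_sample = (rng.random(h0.shape) < h0).astype(float)
    v1 = sigmoid(h_sample @ W.T)                      # reconstruction probs
    h1 = sigmoid(v1 @ W)
    # positive phase minus negative phase, averaged over the batch
    return (v0.T @ h0 - v1.T @ h1) / v0.shape[0]

def rprop_step(grad, prev_sign, step, eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=1.0):
    """Rprop-style update that uses only the SIGN of the gradient estimate:
    per-weight step sizes grow while the sign is stable and shrink when
    it flips (the flip also suppresses that weight's update, as in iRprop-)."""
    sign = np.sign(grad)
    same = sign * prev_sign
    step = np.where(same > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(same < 0, np.maximum(step * eta_minus, step_min), step)
    sign = np.where(same < 0, 0.0, sign)
    delta = step * sign        # ascent direction for the log-likelihood
    return delta, sign, step

# Toy usage: a few sign-based CD updates on random binary data.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))
v0 = (rng.random((8, 4)) < 0.5).astype(float)
prev_sign = np.zeros_like(W)
step = np.full_like(W, 0.1)
for _ in range(5):
    g = cd1_gradient(W, v0, rng)
    delta, prev_sign, step = rprop_step(g, prev_sign, step)
    W += delta
```

The point of the sketch is that the magnitude of the biased CD estimate never enters the update, only its sign; as the paper notes, this alone does not prevent divergence caused by the approximation bias.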

Original language: English
Title: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2011)
Editors: M. Verleysen
Number of pages: 6
Publisher: ESANN
Publication date: 2011
Pages: 495-500
ISBN (Print): 978-2-87419-044-5
ISBN (Electronic): 978-2-87419-057-5
Status: Published - 2011
Event: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning - Bruges, Belgium
Duration: 27 Apr 2011 - 27 Apr 2011
Conference number: 19

Conference

Conference: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Number: 19
Country/Territory: Belgium
City: Bruges
Period: 27/04/2011 - 27/04/2011
