Training RBMs based on the signs of the CD approximation of the log-likelihood derivatives

Asja Fischer, Christian Igel


Abstract

Contrastive Divergence (CD) learning is frequently applied to Restricted Boltzmann Machines (RBMs), the building blocks of deep belief networks. It relies on biased approximations of the log-likelihood gradient, and this bias can deteriorate the learning process. It has been claimed that the signs of most components of the CD update equal the corresponding signs of the log-likelihood gradient. This suggests using optimization techniques that depend only on the signs. Resilient backpropagation is such a method, and we combine it with CD learning. However, it does not prevent divergence caused by the approximation bias.
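
The following is a minimal sketch of the idea summarized above: Rprop-style, sign-only step-size adaptation applied to CD-1 weight updates for a small binary RBM, written in Python/NumPy. The network sizes, constants, the iRprop--style sign-reset, and the omission of bias terms are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

# Small binary RBM: 6 visible, 4 hidden units (illustrative sizes).
n_vis, n_hid = 6, 4
W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_gradient(v0, W):
    """One Gibbs step (CD-1); returns the biased approximation of the
    log-likelihood gradient w.r.t. the weights (bias terms omitted)."""
    ph0 = sigmoid(v0 @ W)                      # P(h=1 | v0)
    h0 = (rng.random(ph0.shape) < ph0) * 1.0   # sample hidden states
    pv1 = sigmoid(h0 @ W.T)                    # P(v=1 | h0)
    v1 = (rng.random(pv1.shape) < pv1) * 1.0   # sample reconstruction
    ph1 = sigmoid(v1 @ W)
    # positive phase minus negative phase, averaged over the batch
    return (v0.T @ ph0 - v1.T @ ph1) / v0.shape[0]

# Rprop state: one adaptive step size per weight; only the signs of
# the CD approximation enter the update.
step = np.full_like(W, 0.1)
prev_sign = np.zeros_like(W)
eta_plus, eta_minus = 1.2, 0.5
step_min, step_max = 1e-6, 1.0

data = (rng.random((20, n_vis)) < 0.5) * 1.0   # toy binary data

for epoch in range(100):
    g = cd1_gradient(data, W)
    sign = np.sign(g)
    agree = prev_sign * sign
    # Grow the step size where the CD sign is stable, shrink it where it flips.
    step = np.where(agree > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(agree < 0, np.maximum(step * eta_minus, step_min), step)
    # Ascend the (approximate) log-likelihood using only the signs.
    W += step * sign
    # After a sign flip, forget the previous sign (iRprop- style).
    prev_sign = np.where(agree < 0, 0.0, sign)

Because the magnitude of the biased CD estimate is discarded, only its sign has to be reliable for this scheme to make progress; as the abstract notes, however, this alone does not rule out divergence caused by the approximation bias.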

Original language: English
Title of host publication: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2011)
Editors: M. Verleysen
Number of pages: 6
Publisher: ESANN
Publication date: 2011
Pages: 495-500
ISBN (Print): 978-2-87419-044-5
ISBN (Electronic): 978-2-87419-057-5
Publication status: Published - 2011
Event: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning - Bruges, Belgium
Duration: 27 Apr 2011 - 27 Apr 2011
Conference number: 19

Conference

Conference: 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Number: 19
Country/Territory: Belgium
City: Bruges
Period: 27/04/2011 - 27/04/2011
