Abstract
Contrastive Divergence (CD) learning is frequently applied to Restricted Boltzmann Machines (RBMs), the building blocks of deep belief networks. It relies on biased approximations of the log-likelihood gradient, and this bias can deteriorate the learning process. It has been claimed that the signs of most components of the CD update equal the corresponding signs of the log-likelihood gradient, which suggests using optimization techniques that depend only on these signs. Resilient backpropagation is such a method, and we combine it with CD learning. However, it does not prevent the divergence caused by the approximation bias.
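The following is a minimal sketch (not the authors' implementation) of the combination described above: the log-likelihood gradient of an RBM is approximated by CD-1, and the weights are updated with a sign-based Rprop rule. It assumes a binary-binary RBM and the simple Rprop variant without weight backtracking; all function names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_weight_gradient(v0, W, b, c):
    """CD-1 estimate of the log-likelihood gradient w.r.t. W for a data batch v0."""
    ph0 = sigmoid(v0 @ W + c)                      # positive phase: p(h=1 | data)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                    # one Gibbs step back to the visible layer
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                      # negative phase: p(h=1 | reconstruction)
    return (v0.T @ ph0 - v1.T @ ph1) / v0.shape[0]

def rprop_step(grad, prev_grad, step, step_min=1e-6, step_max=1.0,
               eta_plus=1.2, eta_minus=0.5):
    """Rprop update: uses only the sign of the gradient estimate; per-weight
    step sizes grow when the sign is stable and shrink when it flips."""
    change = grad * prev_grad
    step = np.where(change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(change < 0, np.maximum(step * eta_minus, step_min), step)
    return np.sign(grad) * step, step              # ascend the estimated gradient

# Illustrative training loop on random binary data.
n_visible, n_hidden = 16, 8
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)     # biases kept fixed for brevity
step = np.full_like(W, 0.1)
prev_grad = np.zeros_like(W)
data = (rng.random((64, n_visible)) < 0.5).astype(float)
for _ in range(100):
    grad = cd1_weight_gradient(data, W, b, c)
    delta, step = rprop_step(grad, prev_grad, step)
    W += delta
    prev_grad = grad
```

Because Rprop discards the magnitude of the CD estimate, its behaviour depends only on the (mostly correct) signs; as the abstract notes, this alone does not remove the divergence caused by the bias.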
Original language | English |
---|---|
Title of host publication | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2011) |
Editors | M. Verleysen |
Number of pages | 6 |
Publisher | ESANN |
Publication date | 2011 |
Pages | 495-500 |
ISBN (Print) | 978-2-87419-044-5 |
ISBN (Electronic) | 978-2-87419-057-5 |
Publication status | Published - 2011 |
Event | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning - Bruges, Belgium. Duration: 27 Apr 2011 → 27 Apr 2011. Conference number: 19 |
Conference
Conference | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning |
---|---|
Number | 19 |
Country/Territory | Belgium |
City | Bruges |
Period | 27/04/2011 → 27/04/2011 |