Abstract
Contrastive Divergence (CD) learning is frequently applied to Restricted Boltzmann Machines (RBMs), the building blocks of deep belief networks. It relies on biased approximations of the log-likelihood gradient, and this bias can deteriorate the learning process. It has been claimed that the signs of most components of the CD update agree with the corresponding signs of the log-likelihood gradient, which suggests using optimization techniques that depend only on the signs. Resilient backpropagation (Rprop) is such a method, and we combine it with CD learning. However, this does not prevent divergence caused by the approximation bias.
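The following is a minimal sketch (not the authors' code) of the idea described in the abstract: an RBM trained with CD-1 where the weight update uses only the sign of the CD estimate, in the style of Rprop. Bias terms are omitted, and the array shapes, step-size constants, and toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 4
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
step = np.full_like(W, 0.1)          # per-weight step sizes (Rprop)
prev_sign = np.zeros_like(W)
eta_plus, eta_minus = 1.2, 0.5       # standard Rprop increase/decrease factors
step_min, step_max = 1e-6, 1.0

data = rng.integers(0, 2, size=(20, n_visible)).astype(float)  # toy binary data

for epoch in range(100):
    # --- CD-1: one step of blocked Gibbs sampling starting from the data ---
    h_prob = sigmoid(data @ W)
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
    v_recon = sigmoid(h_sample @ W.T)
    h_recon = sigmoid(v_recon @ W)

    # Biased CD estimate of the log-likelihood gradient w.r.t. W
    cd_grad = (data.T @ h_prob - v_recon.T @ h_recon) / data.shape[0]

    # --- Rprop-style update: use only the sign of the CD estimate ---
    sign = np.sign(cd_grad)
    agree = sign * prev_sign
    step = np.where(agree > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(agree < 0, np.maximum(step * eta_minus, step_min), step)
    sign = np.where(agree < 0, 0.0, sign)   # skip the update after a sign change
    W += step * sign                        # ascend the (approximate) gradient
    prev_sign = sign
```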
| Original language | English |
|---|---|
| Title | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2011) |
| Editors | M. Verleysen |
| Number of pages | 6 |
| Publisher | ESANN |
| Publication date | 2011 |
| Pages | 495-500 |
| ISBN (Print) | 978-2-87419-044-5 |
| ISBN (Electronic) | 978-2-87419-057-5 |
| Status | Published - 2011 |
| Event | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning - Bruges, Belgium. Duration: 27 Apr 2011 → 27 Apr 2011. Conference number: 19 |
Conference

| Conference | 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning |
|---|---|
| Number | 19 |
| Country/Territory | Belgium |
| City | Bruges |
| Period | 27/04/2011 → 27/04/2011 |