Bounding the bias of contrastive divergence learning

Asja Fischer, Christian Igel


Abstract

Optimization based on k-step contrastive divergence (CD) has become a common way to train restricted Boltzmann machines (RBMs). The k-step CD is a biased estimator of the log-likelihood gradient relying on Gibbs sampling. We derive a new upper bound for this bias. Its magnitude depends on k, the number of variables in the RBM, and the maximum change in energy that can be produced by changing a single variable. The last of these reflects the dependence on the absolute values of the RBM parameters. The magnitude of the bias is also affected by the distance in variation between the modeled distribution and the starting distribution of the Gibbs chain.
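
For context, the following is a minimal sketch of the k-step CD update the abstract refers to, assuming a binary RBM with weight matrix W and visible/hidden biases b and c; the function names and NumPy implementation are illustrative and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_gradients(v0, W, b, c, k, rng):
    """One CD-k gradient estimate for a binary RBM.

    v0 : (n_visible,) binary training vector (start of the Gibbs chain)
    W  : (n_visible, n_hidden) weights; b, c : visible/hidden biases
    Returns approximate log-likelihood gradients w.r.t. W, b, c.
    """
    # Positive phase: hidden probabilities given the training example.
    ph0 = sigmoid(c + v0 @ W)

    # k steps of blocked Gibbs sampling starting from the data.
    v = v0.copy()
    for _ in range(k):
        h = (rng.random(ph0.shape) < sigmoid(c + v @ W)).astype(float)
        v = (rng.random(v.shape) < sigmoid(b + h @ W.T)).astype(float)

    # Negative phase: hidden probabilities at the end of the chain.
    phk = sigmoid(c + v @ W)

    # CD-k replaces the intractable model expectation with the k-step
    # sample; this substitution is the source of the bias bounded in
    # the paper.
    dW = np.outer(v0, ph0) - np.outer(v, phk)
    db = v0 - v
    dc = ph0 - phk
    return dW, db, dc
```

Starting the chain at the training example (rather than at the model distribution) is what makes the estimator biased; the bound discussed in the abstract quantifies how this bias depends on k, the RBM size, and the parameter magnitudes.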
Original language: English
Journal: Neural Computation
Volume: 23
Issue number: 3
Pages (from-to): 664-673
Number of pages: 10
ISSN: 0899-7667
DOIs
Publication status: Published - Mar 2011
