Abstract
How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining the nonlinear recursive structure of a recurrent neural network while averaging over the uncertainty in a latent path, as a state space model does, we improve state-of-the-art results on the Blizzard and TIMIT speech modeling data sets by a large margin, and achieve performance comparable to competing methods on polyphonic music modeling.
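The layering the abstract describes can be made concrete with a short sketch. Below is a minimal, illustrative PyTorch rendering of one generative step, assuming diagonal-Gaussian transition and emission densities; the class and layer names (`SRNNSketch`, `prior`, `emit`) and the layer sizes are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SRNNSketch(nn.Module):
    """Minimal sketch: a deterministic GRU layer stacked with a stochastic
    state space layer, with diagonal Gaussians as an assumed parameterization."""

    def __init__(self, u_dim, d_dim, z_dim, x_dim):
        super().__init__()
        self.gru = nn.GRUCell(u_dim, d_dim)               # deterministic recurrence d_t = f(d_{t-1}, u_t)
        self.prior = nn.Linear(d_dim + z_dim, 2 * z_dim)  # transition p(z_t | z_{t-1}, d_t)
        self.emit = nn.Linear(d_dim + z_dim, 2 * x_dim)   # emission p(x_t | z_t, d_t)

    def forward(self, u, d, z):
        # One generative step: update the deterministic state, then sample
        # the stochastic state conditioned on it.
        d = self.gru(u, d)
        mu_z, logvar_z = self.prior(torch.cat([d, z], -1)).chunk(2, -1)
        z = mu_z + (0.5 * logvar_z).exp() * torch.randn_like(mu_z)  # reparameterized sample
        mu_x, logvar_x = self.emit(torch.cat([d, z], -1)).chunk(2, -1)
        return d, z, (mu_x, logvar_x)
```

In this reading, the deterministic path `d` carries the nonlinear recursive structure of the RNN, while the stochastic path `z` carries the latent uncertainty that training averages over; the clean split between the two is what lets a structured inference network mirror the posterior factorization.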
Original language | English |
---|---|
Title | Neural Information Processing Systems 2016 |
Editors | D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, R. Garnett |
Number of pages | 9 |
Publisher | Neural Information Processing Systems Foundation |
Publication date | 2016 |
Pages | 2207-2215 |
Status | Published - 2016 |
Event | 30th Annual Conference on Neural Information Processing Systems, Barcelona, Spain. Duration: 5 Dec 2016 → 10 Dec 2016. Conference number: 30 |
Conference

Conference | 30th Annual Conference on Neural Information Processing Systems |
---|---|
Number | 30 |
Country/Territory | Spain |
City | Barcelona |
Period | 05/12/2016 → 10/12/2016 |
Name | Advances in Neural Information Processing Systems |
---|---|
Volume | 29 |
ISSN | 1049-5258 |