Kalman filters improve LSTM network performance in problems unsolvable by traditional recurrent nets

Juan Antonio Pérez-Ortiz, Felix A. Gers, Douglas Eck, Jürgen U. Schmidhuber

Research output: Contribution to journal › Article › peer-review

75 Scopus citations

Abstract

The long short-term memory (LSTM) network trained by gradient descent solves difficult problems which traditional recurrent neural networks in general cannot. We have recently observed that the decoupled extended Kalman filter training algorithm allows for even better performance, significantly reducing the number of training steps compared to the original gradient descent training algorithm. In this paper we present a set of experiments which are unsolvable by classical recurrent networks but which are solved elegantly, robustly, and quickly by LSTM combined with Kalman filters. © 2003 Elsevier Science Ltd. All rights reserved.
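The abstract contrasts LSTM trained with the decoupled extended Kalman filter (DEKF) against plain gradient descent. As a rough illustration of the kind of update DEKF performs, below is a minimal NumPy sketch of a decoupled EKF weight update for a scalar-output model. The function name `dekf_step`, the toy linear fit, and the noise parameters `r` and `q` are illustrative assumptions, not the authors' implementation or the paper's experimental setup.

```python
# Hedged sketch of a decoupled extended Kalman filter (DEKF) weight update.
# Weights are split into groups; each group keeps its own covariance matrix,
# which is what makes the filter "decoupled". Names are illustrative only.
import numpy as np

def dekf_step(weights, covariances, jacobians, error, r=1.0, q=1e-4):
    """One DEKF update for a scalar network output.

    weights      : list of 1-D arrays, one per weight group
    covariances  : list of (n_i, n_i) arrays P_i, one per group
    jacobians    : list of 1-D arrays H_i = d(output)/d(w_i)
    error        : scalar (target - output)
    r            : assumed measurement-noise variance (hyperparameter)
    q            : artificial process noise to keep P_i well conditioned
    """
    # Scalar innovation variance shared by all groups: R + sum_i H_i P_i H_i^T
    s = r + sum(h @ P @ h for h, P in zip(jacobians, covariances))
    new_w, new_P = [], []
    for w, P, h in zip(weights, covariances, jacobians):
        k = (P @ h) / s                                   # Kalman gain for this group
        new_w.append(w + k * error)                       # error-driven weight update
        new_P.append(P - np.outer(k, h @ P) + q * np.eye(len(w)))  # covariance update
    return new_w, new_P

# Toy usage: fit y = w.x online with a single weight group (hypothetical demo).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
w = [np.zeros(2)]
P = [np.eye(2) * 100.0]            # large initial uncertainty
for _ in range(200):
    x = rng.normal(size=2)
    y = true_w @ x                 # noiseless target for the demo
    h = x                          # Jacobian of w.x with respect to w
    w, P = dekf_step(w, P, [h], y - w[0] @ x)
print(w[0])                        # approaches [2.0, -1.0]
```

In the paper's setting the Jacobians would come from backpropagating the LSTM output with respect to each unit's weights, and the groups would typically be one per memory block or neuron; the sketch above only shows the shape of the filter equations.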
Original language: English (US)
Pages (from-to): 241-250
Number of pages: 10
Journal: Neural Networks
Volume: 16
Issue number: 2
DOIs
State: Published - Mar 1 2003
Externally published: Yes

ASJC Scopus subject areas

  • Artificial Intelligence
  • Cognitive Neuroscience
