Learning to forget: Continual prediction with LSTM

Felix A. Gers, Jürgen Schmidhuber, Fred Cummins

Research output: Contribution to journal › Article › peer-review

3873 Scopus citations

Abstract

Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel, adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review illustrative benchmark problems on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve continual versions of these problems. LSTM with forget gates, however, easily solves them, and in an elegant way.
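As a rough illustration of the mechanism described in the abstract, below is a minimal sketch of a single LSTM cell step with a forget gate, written in modern notation; the weight names (W_*, U_*, b_*), the use of NumPy, and the exact gating form are assumptions for illustration, not the paper's original formulation.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_forget_gate_step(x, h_prev, c_prev, params):
        """One step of an LSTM cell with a forget gate (illustrative sketch).

        The forget gate f scales the previous cell state c_prev, so the cell
        can learn to reset (release) its internal state at appropriate times
        instead of letting it grow indefinitely on a continual input stream.
        """
        W_f, U_f, b_f = params["f"]   # forget gate weights (assumed names)
        W_i, U_i, b_i = params["i"]   # input gate weights
        W_o, U_o, b_o = params["o"]   # output gate weights
        W_c, U_c, b_c = params["c"]   # candidate cell-input weights

        f = sigmoid(W_f @ x + U_f @ h_prev + b_f)   # forget gate activation
        i = sigmoid(W_i @ x + U_i @ h_prev + b_i)   # input gate activation
        o = sigmoid(W_o @ x + U_o @ h_prev + b_o)   # output gate activation
        g = np.tanh(W_c @ x + U_c @ h_prev + b_c)   # candidate state update

        c = f * c_prev + i * g       # f near 0 resets the state; near 1 retains it
        h = o * np.tanh(c)           # gated cell output
        return h, c

Without the forget gate, the update reduces to c = c_prev + i * g, which is the standard-LSTM behavior the paper identifies as problematic for unsegmented, continual input streams.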
Original language: English (US)
Pages (from-to): 2451-2471
Number of pages: 21
Journal: Neural Computation
Volume: 12
Issue number: 10
DOIs
State: Published - Jan 1 2000
Externally published: Yes

ASJC Scopus subject areas

  • Cognitive Neuroscience
