Learning to forget: Continual prediction with LSTM

Felix A. Gers, Jürgen Schmidhuber, Fred Cummins

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Long Short-Term Memory (LSTM) can solve many tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends. Without resets, the internal state values may grow indefinitely and eventually cause the network to break down. Our remedy is an adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review an illustrative benchmark problem on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve a continual version of that problem. LSTM with forget gates, however, easily solves it in an elegant way.
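For context, a sketch of the forget-gate mechanism in later, commonly used LSTM notation (the symbols W_f, U_f, b_f, i_t and \tilde{c}_t follow standard modern conventions and are not taken from this page or the paper's original cell-block formulation):

    f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)          % forget gate activation, in (0, 1)
    c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t    % previous cell state scaled by f_t before new input is added

With f_t near 1 the cell retains its state indefinitely, as in the original LSTM; with f_t near 0 it resets itself, which is the learned-reset behaviour the abstract describes for continual input streams.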
Original language: English (US)
Title of host publication: IEE Conference Publication
Publisher: IEE, Stevenage
Pages: 850-855
Number of pages: 6
ISBN (Print): 0852967217
DOIs
State: Published - Jan 1 1999
Externally published: Yes
