Neural processing of complex continual input streams

Felix A. Gers, Jürgen Schmidhuber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Long Short-Term Memory (LSTM) can learn algorithms for temporal pattern processing that are not learnable by alternative recurrent neural networks (RNNs) or by other methods such as Hidden Markov Models (HMMs) and symbolic grammar learning (SGL). Here we present tasks involving arithmetic operations on continual input streams that even LSTM cannot solve. However, an LSTM variant based on "forget gates", a recent extension, has superior arithmetic capabilities and does solve these tasks.
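The forget-gate extension the abstract refers to adds a gate that scales the carried cell state, letting the network reset internal accumulators between segments of a continual stream. The sketch below shows one step of an LSTM cell with a forget gate under the standard gate equations; the weight layout and names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of an LSTM cell with a forget gate.

    W, U, b stack the parameters for the forget (f), input (i),
    cell-candidate (g), and output (o) gates, in that order.
    (Illustrative layout, not the paper's code.)
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # stacked pre-activations for all four gates
    f = sigmoid(z[0 * n:1 * n])      # forget gate: how much old cell state to keep
    i = sigmoid(z[1 * n:2 * n])      # input gate
    g = np.tanh(z[2 * n:3 * n])      # candidate cell update
    o = sigmoid(z[3 * n:4 * n])      # output gate
    c = f * c_prev + i * g           # forget gate scales the carried state
    h = o * np.tanh(c)
    return h, c

# Tiny demo: 1 input, 2 hidden units, fixed seed for reproducibility.
rng = np.random.default_rng(0)
n_in, n_hid = 1, 2
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for t in range(5):
    h, c = lstm_step(np.array([1.0]), h, c, W, U, b)
print(h.shape, c.shape)
```

Without the forget gate (`f` fixed to 1), `c` can only grow as a stream continues, which is one intuition for why plain LSTM struggles with unbounded continual input.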
Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: IEEE, Piscataway, NJ, United States
Pages: 557-562
Number of pages: 6
State: Published - Jan 1 2000
Externally published: Yes
