TY - GEN
T1 - Modeling systems with internal state using evolino
AU - Wierstra, Daan
AU - Gomez, Faustino J.
AU - Schmidhuber, Jürgen
N1 - Generated from Scopus record by KAUST IRTS on 2022-09-14
PY - 2005/12/1
Y1 - 2005/12/1
N2 - Existing Recurrent Neural Networks (RNNs) are limited in their ability to model dynamical systems with nonlinearities and hidden internal states. Here we use our general framework for sequence learning, EVOlution of recurrent systems with LINear Outputs (Evolino), to discover good RNN hidden node weights through evolution, while using linear regression to compute an optimal linear mapping from hidden state to output. Using the Long Short-Term Memory RNN architecture, Evolino outperforms previous state-of-the-art methods on several tasks: 1) context-sensitive languages, 2) multiple superimposed sine waves. Copyright 2005 ACM.
AB - Existing Recurrent Neural Networks (RNNs) are limited in their ability to model dynamical systems with nonlinearities and hidden internal states. Here we use our general framework for sequence learning, EVOlution of recurrent systems with LINear Outputs (Evolino), to discover good RNN hidden node weights through evolution, while using linear regression to compute an optimal linear mapping from hidden state to output. Using the Long Short-Term Memory RNN architecture, Evolino outperforms previous state-of-the-art methods on several tasks: 1) context-sensitive languages, 2) multiple superimposed sine waves. Copyright 2005 ACM.
UR - http://portal.acm.org/citation.cfm?doid=1068009.1068315
UR - http://www.scopus.com/inward/record.url?scp=32444434467&partnerID=8YFLogxK
U2 - 10.1145/1068009.1068315
DO - 10.1145/1068009.1068315
M3 - Conference contribution
SN - 1595930108
SP - 1795
EP - 1802
BT - GECCO 2005 - Genetic and Evolutionary Computation Conference
ER -