Abstract
Long Short-Term Memory (LSTM) is one of the most successful recent methods for supervised sequence learning. Using gradient descent, it trains memory cells represented as differentiable computational graph structures. Interestingly, the standard LSTM cell structure appears somewhat arbitrary. In this paper we optimize its computational structure with a multi-objective evolutionary algorithm, whose fitness function reflects a structure's usefulness for learning various formal languages. The evolved cells help identify which structural features are crucial for sequence learning.
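To make the idea concrete, below is a minimal, hypothetical sketch of the kind of loop the abstract describes: a memory cell encoded as a small computational graph, mutated by a simple evolutionary algorithm, with fitness measured on a toy formal-language task (tracking nesting depth in a^n b^n strings). The cell encoding, operators, fitness function, and all names here are illustrative assumptions, not the paper's actual method or a multi-objective setup.

```python
# Illustrative sketch only: a (1+1) evolutionary loop over tiny
# computational-graph "memory cells". All details are assumptions.
import random
import math

OPS = ["add", "mul", "tanh", "sigmoid"]  # node types the graph may use

def random_cell(n_nodes=4):
    """Encode a cell as a list of (op, input_index) pairs."""
    return [(random.choice(OPS), random.randrange(i + 1)) for i in range(n_nodes)]

def mutate(cell):
    """Point-mutate one node's operation or its input wiring."""
    child = list(cell)
    i = random.randrange(len(child))
    op, src = child[i]
    if random.random() < 0.5:
        op = random.choice(OPS)
    else:
        src = random.randrange(i + 1)
    child[i] = (op, src)
    return child

def step(cell, state, x):
    """Run one time step of the cell's computational graph."""
    values = [x]  # node 0 is the input symbol
    for op, src in cell:
        v = values[src]
        if op == "add":
            v = v + state
        elif op == "mul":
            v = v * state
        elif op == "tanh":
            v = math.tanh(v)
        else:  # sigmoid, clamped to avoid overflow
            v = 1.0 / (1.0 + math.exp(-max(-50.0, min(50.0, v))))
        values.append(v)
    return values[-1]  # last node's output becomes the new state

def fitness(cell):
    """Toy objective: state should track nesting depth on a^n b^n."""
    error = 0.0
    for n in range(1, 6):
        state, depth = 0.0, 0
        for sym in "a" * n + "b" * n:
            depth += 1 if sym == "a" else -1
            state = step(cell, state, 1.0 if sym == "a" else -1.0)
            error += (state - depth / n) ** 2
    return -error  # higher is better

best = random_cell()
for _ in range(200):  # simple hill-climbing evolutionary loop
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
print("best cell:", best, "fitness:", fitness(best))
```

The paper's approach differs in scale and rigor (multi-objective selection, several formal languages, gradient-trained cells), but the sketch shows the basic search structure: encode the cell, mutate it, and keep variants that learn the language task better.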
| Original language | English (US) |
| --- | --- |
| Title of host publication | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| Pages | 755-764 |
| Number of pages | 10 |
| DOIs | |
| State | Published - Nov 27 2009 |
| Externally published | Yes |
ASJC Scopus subject areas
- Theoretical Computer Science
- General Computer Science