Evolving memory cell structures for sequence learning

Justin Bayer, Daan Wierstra, Julian Togelius, Jürgen Schmidhuber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

78 Scopus citations

Abstract

Long Short-Term Memory (LSTM) is one of the best recent supervised sequence learning methods. Using gradient descent, it trains memory cells represented as differentiable computational graph structures. Interestingly, LSTM's cell structure seems somewhat arbitrary. In this paper we optimize its computational structure using a multi-objective evolutionary algorithm. The fitness function reflects the structure's usefulness for learning various formal languages. The evolved cells help to understand crucial features that aid sequence learning.
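To make the abstract's method concrete, here is a minimal sketch of evolving cell structures under a multi-objective fitness. Everything in it is an illustrative assumption rather than the authors' actual algorithm: the node primitives, the graph encoding, the point-mutation operator, and the Pareto-front selection are stand-ins, and the toy objectives replace what the paper describes as the cell's usefulness (after gradient-descent training) on several formal languages.

import random

OPS = ["add", "mul", "sigmoid", "tanh"]  # assumed primitive node types

def random_cell(n_nodes=4):
    # A cell is a list of (op, input_a, input_b) triples; inputs refer to
    # earlier nodes (0..i-1) or to the two external inputs
    # (-1: current input x_t, -2: recurrent state). This encoding is a
    # guess at "memory cells as computational graph structures".
    cell = []
    for i in range(n_nodes):
        sources = list(range(-2, i))
        cell.append((random.choice(OPS),
                     random.choice(sources),
                     random.choice(sources)))
    return cell

def mutate(cell):
    # Point-mutate one node: swap its operation or rewire one input.
    cell = list(cell)
    i = random.randrange(len(cell))
    op, a, b = cell[i]
    sources = list(range(-2, i))
    r = random.random()
    if r < 1 / 3:
        op = random.choice(OPS)
    elif r < 2 / 3:
        a = random.choice(sources)
    else:
        b = random.choice(sources)
    cell[i] = (op, a, b)
    return cell

def evaluate(cell, tasks):
    # One error score per task; in the paper each score would come from
    # training the candidate cell on a formal language.
    return [task(cell) for task in tasks]

def dominates(f, g):
    # Pareto dominance for minimization: f is no worse on every
    # objective and strictly better on at least one.
    return (all(a <= b for a, b in zip(f, g)) and
            any(a < b for a, b in zip(f, g)))

def evolve(tasks, pop_size=20, generations=50):
    pop = [random_cell() for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(c, evaluate(c, tasks)) for c in pop]
        # Keep the nondominated front, refill with mutated copies.
        front = [c for c, f in scored
                 if not any(dominates(g, f) for _, g in scored)]
        pop = front + [mutate(random.choice(front))
                       for _ in range(pop_size - len(front))]
    return front

# Toy usage: two placeholder objectives standing in for per-language
# training error (counting op types is not a real fitness measure).
mul_count = lambda c: sum(op == "mul" for op, _, _ in c)
sig_count = lambda c: sum(op == "sigmoid" for op, _, _ in c)
pareto_front = evolve([mul_count, sig_count])

A real implementation would also need to instantiate each evolved graph as a differentiable recurrent cell and train its weights by gradient descent before scoring it, which is the expensive inner loop the abstract alludes to.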
Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 755-764
Number of pages: 10
DOIs
State: Published - Nov 27 2009
Externally published: Yes

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
