Efficient lifelong learning with A-GEM

Arslan Chaudhry, Marc'Aurelio Ranzato, Marcus Rohrbach, Mohamed Elhoseiny

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In lifelong learning, the learner is presented with a sequence of tasks, incrementally building a data-driven prior which may be leveraged to speed up learning of a new task. In this work, we investigate the efficiency of current lifelong approaches in terms of sample complexity and computational and memory cost. Towards this end, we first introduce a new, more realistic evaluation protocol, whereby learners observe each example only once and hyper-parameter selection is done on a small and disjoint set of tasks, which is not used for the actual learning experience and evaluation. Second, we introduce a new metric measuring how quickly a learner acquires a new skill. Third, we propose an improved version of GEM (Lopez-Paz & Ranzato, 2017), dubbed Averaged GEM (A-GEM), which enjoys the same or even better performance than GEM, while being almost as computationally and memory efficient as EWC (Kirkpatrick et al., 2016) and other regularization-based methods. Finally, we show that all algorithms, including A-GEM, can learn even more quickly if they are provided with task descriptors specifying the classification tasks under consideration. Our experiments on several standard lifelong learning benchmarks demonstrate that A-GEM has the best trade-off between accuracy and efficiency.
Original language: English (US)
Title of host publication: 7th International Conference on Learning Representations, ICLR 2019
Publisher: International Conference on Learning Representations, ICLR
State: Published - Jan 1 2019
Externally published: Yes
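
The record above reproduces only the abstract, but the core of A-GEM is a single gradient projection, which may help readers place the efficiency claims. Below is a minimal NumPy sketch of that step as described in the paper: the current-task gradient is modified only when it conflicts with a reference gradient computed on a batch drawn from episodic memory. The function name and the flattened-gradient representation are illustrative, not the authors' reference implementation.

```python
import numpy as np

def a_gem_project(g: np.ndarray, g_ref: np.ndarray) -> np.ndarray:
    """A-GEM gradient projection (sketch).

    g     -- flattened gradient on the current task's mini-batch
    g_ref -- flattened gradient on a mini-batch sampled from episodic memory

    When g would increase the average loss on the memory batch
    (i.e. g . g_ref < 0), project g onto the half-space where the
    memory loss does not increase:

        g_tilde = g - (g . g_ref) / (g_ref . g_ref) * g_ref

    Otherwise g is used unchanged, so only one extra gradient
    computation and one dot product are needed per update.
    """
    dot = float(np.dot(g, g_ref))
    if dot < 0.0:
        return g - (dot / float(np.dot(g_ref, g_ref))) * g_ref
    return g

# Example: a conflicting pair of gradients gets projected.
g = np.array([1.0, -1.0])
g_ref = np.array([0.0, 1.0])
print(a_gem_project(g, g_ref))  # -> [1. 0.]
```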
