Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules

Kazuki Irie, Francesco Faccio, Juergen Schmidhuber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural ordinary differential equations (ODEs) have attracted much attention as continuous-time counterparts of deep residual neural networks (NNs), and numerous extensions for recurrent NNs have been proposed. Since the 1980s, ODEs have also been used to derive theoretical results for NN learning rules, e.g., the famous connection between Oja's rule and principal component analysis. Such rules are typically expressed as additive iterative update processes which have straightforward ODE counterparts. Here we introduce a novel combination of learning rules and Neural ODEs to build continuous-time sequence processing nets that learn to manipulate short-term memory in rapidly changing synaptic connections of other nets. This yields continuous-time counterparts of Fast Weight Programmers and linear Transformers. Our novel models outperform the best existing Neural Controlled Differential Equation based models on various time series classification tasks, while also addressing their fundamental scalability limitations. Our code is public.
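To make the "additive iterative update → ODE counterpart" correspondence mentioned in the abstract concrete, below is a minimal NumPy sketch, not taken from the paper: it integrates Oja's rule as an ODE with forward Euler (recovering the top principal component, the classical connection cited above), and then sketches a hypothetical continuous-time fast weight update in delta-rule form. The step size, the `beta` rate, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Oja's rule: discrete additive update and its ODE counterpart.
# Discrete: w <- w + eta * y * (x - y * w), with y = w @ x.
# ODE:      dw/dt = y * (x - y * w).

rng = np.random.default_rng(0)

# Toy data whose top principal component Oja's rule should recover.
X = rng.normal(size=(1000, 3)) @ np.diag([3.0, 1.0, 0.3])

def oja_ode_rhs(w, x):
    """Right-hand side of the Oja ODE for a single input x."""
    y = w @ x
    return y * (x - y * w)

# Forward-Euler integration; the step size plays the role of eta.
w = rng.normal(size=3)
w /= np.linalg.norm(w)
dt = 0.01
for _ in range(5):          # a few passes over the data
    for x in X:
        w = w + dt * oja_ode_rhs(w, x)

# Compare with the top principal component from an eigendecomposition.
top_pc = np.linalg.eigh(np.cov(X.T))[1][:, -1]
print("cosine similarity to top PC:", abs(w @ top_pc) / np.linalg.norm(w))

# Hypothetical continuous-time fast weight update in delta-rule form,
# a sketch only -- the paper's exact formulation is not given here:
#   dW/dt = beta * (v(t) - W k(t)) k(t)^T
def fw_ode_rhs(W, k, v, beta=1.0):
    return beta * np.outer(v - W @ k, k)
```

The same Euler loop applied to `fw_ode_rhs` would make the fast weight matrix `W` a rapidly changing short-term memory written by key/value streams `k(t)`, `v(t)`, which is the general flavor of the continuous-time Fast Weight Programmers the abstract describes.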
Original language: English (US)
Title of host publication: 36th Conference on Neural Information Processing Systems (NeurIPS 2022)
Publisher: arXiv
State: Published - Oct 14 2022
