TY - JOUR
T1 - Incremental slow feature analysis: Adaptive low-complexity slow feature updating from high-dimensional input streams
AU - Kompella, Varun Raj
AU - Luciw, Matthew
AU - Schmidhuber, Jürgen
N1 - Generated from Scopus record by KAUST IRTS on 2022-09-14
PY - 2012/1/1
Y1 - 2012/1/1
N2 - We introduce here an incremental version of slow feature analysis (IncSFA), combining candid covariance-free incremental principal components analysis (CCIPCA) and covariance-free incremental minor components analysis (CIMCA). IncSFA's feature updating complexity is linear with respect to the input dimensionality, while batch SFA's (BSFA) updating complexity is cubic. IncSFA does not need to store, or even compute, any covariance matrices. The drawback to IncSFA is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be tractably applied, with just a few parameters, directly on high-dimensional input streams (e.g., visual input of an autonomous agent), while BSFA has to resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA's updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails. © 2012 Massachusetts Institute of Technology.
AB - We introduce here an incremental version of slow feature analysis (IncSFA), combining candid covariance-free incremental principal components analysis (CCIPCA) and covariance-free incremental minor components analysis (CIMCA). IncSFA's feature updating complexity is linear with respect to the input dimensionality, while batch SFA's (BSFA) updating complexity is cubic. IncSFA does not need to store, or even compute, any covariance matrices. The drawback to IncSFA is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be tractably applied, with just a few parameters, directly on high-dimensional input streams (e.g., visual input of an autonomous agent), while BSFA has to resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA's updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails. © 2012 Massachusetts Institute of Technology.
UR - https://direct.mit.edu/neco/article/24/11/2994-3024/7824
UR - http://www.scopus.com/inward/record.url?scp=84867687400&partnerID=8YFLogxK
U2 - 10.1162/NECO_a_00344
DO - 10.1162/NECO_a_00344
M3 - Article
SN - 1530-888X
VL - 24
SP - 2994
EP - 3024
JO - Neural Computation
JF - Neural Computation
IS - 11
ER -