Neural Computation, 14(4):715-770, April 2002 (bibtex, paper.ps.gz)

Slow feature analysis: Unsupervised learning of invariances.

Laurenz Wiskott and Terrence J. Sejnowski


Abstract: Invariant features of temporally varying signals are useful for analysis and classification. Slow feature analysis (SFA) is a new method for learning invariant or slowly varying features from a vectorial input signal. SFA is based on a non-linear expansion of the input signal and application of principal component analysis to this expanded signal and its time derivative. It is guaranteed to find the optimal solution within a family of functions directly and can learn to extract a large number of decorrelated features, which are ordered by their degree of invariance. SFA can be applied hierarchically to process high-dimensional input signals and to extract complex features. Slow feature analysis is first applied to complex-cell tuning properties based on simple-cell output, including disparity and motion. Then, more complicated input-output functions are learned by repeated application of SFA. Finally, a hierarchical network of SFA modules is presented as a simple model of the visual system. The same unstructured network can learn translation, size, rotation, contrast, or, to a lesser degree, illumination invariance for one-dimensional objects, depending only on the training stimulus. Surprisingly, only a few training objects sufficed to achieve good generalization to new objects. The generated representation is suitable for object recognition. Performance degrades if the network is trained to learn multiple invariances simultaneously.
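The algorithm summarized in the abstract — non-linear expansion of the input, then principal component analysis of the expanded signal and of its time derivative — can be sketched in a few lines of NumPy. This is an illustrative sketch only: the quadratic expansion, the finite-difference derivative, and the toy two-dimensional test signal below are assumptions for demonstration, not the exact setup used in the paper.

```python
import numpy as np

def quadratic_expand(X):
    """Non-linear expansion: all monomials of degree 1 and 2."""
    n, d = X.shape
    quads = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack([X] + quads)

def sfa(X, n_features=1):
    """Slow feature analysis sketch: expand, whiten, then take the
    directions of smallest variance of the time derivative."""
    # 1. non-linear expansion of the input signal
    Z = quadratic_expand(X)
    Z = Z - Z.mean(axis=0)
    # 2. whiten (sphere) the expanded signal via SVD (PCA step)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    keep = S > 1e-10 * S[0]                  # drop near-degenerate directions
    W = Vt[keep].T / S[keep] * np.sqrt(len(Z) - 1)
    Zw = Z @ W                               # unit covariance
    # 3. PCA on the time derivative (finite differences); the slowest
    #    features are the directions of SMALLEST derivative variance
    dZ = np.diff(Zw, axis=0)
    _, _, vt = np.linalg.svd(dZ, full_matrices=False)
    return Zw @ vt[::-1][:n_features].T      # reversed: slowest first

# Toy example: a slow signal sin(t) hidden non-linearly in a fast carrier.
# Here x1 - x2**2 = sin(t), which a quadratic expansion can recover.
t = np.linspace(0, 4 * np.pi, 4000)
X = np.column_stack([np.sin(t) + np.cos(11 * t) ** 2, np.cos(11 * t)])
slow = sfa(X, n_features=1).ravel()
```

Because SFA fixes the output only up to sign and scale, the recovered feature is compared to the hidden slow signal by correlation rather than pointwise.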


March 13, 2002, Laurenz Wiskott, http://www.neuroinformatik.ruhr-uni-bochum.de/PEOPLE/wiskott/