A. Hyvärinen and E. Oja. Independent Component Analysis by General Non-linear Hebbian-like Learning Rules. Signal Processing, Vol. 64, No. 3, pp. 301-313, 1998.

Abstract: A number of neural learning rules have recently been proposed for Independent Component Analysis (ICA). The rules are usually derived from information-theoretic criteria such as maximum entropy or minimum mutual information. In this paper, we show that, in fact, ICA can be performed by very simple Hebbian or anti-Hebbian learning rules, which may have only weak relations to such information-theoretic quantities. Rather surprisingly, practically any non-linear function can be used in the learning rule, provided only that the sign of the Hebbian/anti-Hebbian term is chosen correctly. In addition to the Hebbian-like mechanism, the weight vector is here constrained to have unit norm, and the data is preprocessed by prewhitening, or sphering. These results imply that one can choose the non-linearity so as to optimize desired statistical or numerical criteria.
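The ingredients named in the abstract (a Hebbian/anti-Hebbian update with an almost arbitrary non-linearity, a unit-norm constraint on the weight vector, and prewhitened data) can be sketched in a few lines of NumPy. This is a minimal one-unit illustration, not the paper's exact algorithm: the choice of g = tanh, the anti-Hebbian sign for a super-Gaussian source, and the step size are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sources: sub-Gaussian (uniform) and super-Gaussian (Laplace).
n = 10000
S = np.vstack([rng.uniform(-1.0, 1.0, n), rng.laplace(0.0, 1.0, n)])
A = rng.normal(size=(2, 2))          # unknown mixing matrix
X = A @ S                            # observed mixtures

# Prewhitening ("sphering"): center, then decorrelate to unit covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X  # whitened data: Z Z^T / n ~ identity

# Hebbian-like rule on the whitened data:
#   w <- w + mu * sigma * E[z g(w.z)],  then renormalize to unit norm.
# g is the (almost arbitrary) non-linearity; sigma = +/-1 is the
# Hebbian/anti-Hebbian sign. Here sigma = -1 with g = tanh, a choice
# assumed to suit a super-Gaussian source in this sketch.
g = np.tanh
sigma = -1.0
mu = 0.1
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(500):
    y = w @ Z
    w = w + mu * sigma * (Z @ g(y)) / n
    w /= np.linalg.norm(w)           # unit-norm constraint

# After convergence, y = w.Z recovers one source up to sign and scale.
y = w @ Z
```

With this sign choice the rule settles on the heavy-tailed (Laplace) source; flipping sigma would instead favor the sub-Gaussian one, which is the sign condition the abstract alludes to.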
