A. Hyvärinen. Independent Component Analysis in the Presence of Gaussian Noise by Maximizing Joint Likelihood. Neurocomputing, Vol. 22, pp. 49-67, 1998. (Supersedes papers at I&ANN'98 workshop and at IJCNN'98).

Abstract: We consider the estimation of the data model of independent component analysis when Gaussian noise is present. We show that joint maximum likelihood estimation of the independent components and the mixing matrix leads to an objective function that was proposed earlier, via a different derivation, by Olshausen and Field. Because this objective function is difficult to optimize directly, we introduce approximations that greatly simplify the optimization problem. We show that the presence of noise makes the relation between the observed data and the estimates of the independent components non-linear, and we show how to approximate this non-linearity. In particular, for supergaussian (sparse) data the non-linearity may be approximated by a simple shrinkage operation. Using these approximations, we propose an efficient algorithm for approximate maximization of the likelihood. In the case of supergaussian components, the algorithm may be approximated by simple competitive learning, and in the case of subgaussian components, by anti-competitive learning.
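To illustrate the shrinkage idea mentioned in the abstract, here is a minimal sketch (not the paper's exact estimator): under the assumption of a Laplacian prior on a sparse component and additive Gaussian noise, the MAP estimate of the component given its noisy value is a soft-thresholding (shrinkage) nonlinearity. The function name and parameters below are illustrative only.

```python
# Illustrative sketch, assuming a Laplacian prior p(s) ~ exp(-|s|/scale) and
# additive Gaussian noise. The MAP estimate of the component is then a
# soft-thresholding ("shrinkage") operation, as described in the abstract.
import numpy as np

def shrink(y, noise_var, scale):
    """Shrink noisy component estimates y toward zero.

    y         : noisy estimates of an independent component
    noise_var : variance of the additive Gaussian noise
    scale     : scale parameter of the assumed Laplacian prior
    """
    threshold = noise_var / scale  # MAP threshold for this prior/noise pair
    return np.sign(y) * np.maximum(np.abs(y) - threshold, 0.0)

# Example: a sparse (supergaussian) source corrupted by Gaussian noise.
rng = np.random.default_rng(0)
s = rng.laplace(scale=1.0, size=10_000)      # supergaussian source
y = s + rng.normal(scale=0.3, size=s.shape)  # noisy observation
s_hat = shrink(y, noise_var=0.3**2, scale=1.0)
# Compare mean squared error of the raw and shrunken estimates.
print(np.mean((y - s) ** 2), np.mean((s_hat - s) ** 2))
```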
