A common problem encountered in disciplines such as statistics, data analysis, signal processing,
and neural network research is finding a suitable representation of multivariate
data. For computational and conceptual simplicity, such a
representation is often sought as a linear transformation
of the original data.
Well-known linear transformation methods include
principal component analysis, factor analysis, and projection pursuit.
A recently developed linear transformation
method is independent component analysis (ICA), in which the
desired representation is the one that minimizes the statistical dependence of
its components.
Such a representation seems to capture the essential structure of the
data in many applications.
In this paper, we survey the existing theory and methods for ICA.
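As a brief illustration of the kind of decomposition ICA produces, the following is a minimal sketch that recovers two independent sources from their linear mixtures. The source signals, the mixing matrix, and the use of the FastICA estimator from scikit-learn are assumptions chosen purely for this example, not a prescription from the survey itself.

# Minimal ICA sketch: recover two independent sources from linear mixtures.
# The signals and mixing matrix below are arbitrary, for illustration only.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two statistically independent, non-Gaussian source signals.
s1 = np.sign(np.sin(3 * t))            # square wave
s2 = rng.laplace(size=t.size)          # super-Gaussian noise
S = np.c_[s1, s2]

# Observed data: an unknown linear mixture of the sources.
A = np.array([[1.0, 0.5],
              [0.7, 1.2]])             # hypothetical mixing matrix
X = S @ A.T

# ICA estimates an unmixing transformation that makes the recovered
# components as statistically independent as possible.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)           # estimated sources (up to scale and order)
print("Estimated mixing matrix:\n", ica.mixing_)

The recovered components match the original sources only up to permutation and scaling, a well-known indeterminacy of the ICA model.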
Keywords: Independent component analysis, blind source separation, factor analysis, data analysis, higher-order statistics, neural networks, unsupervised learning, Hebbian learning