To summarize, the choice of the ICA algorithm is basically a choice between adaptive and batch-mode (block) algorithms.
In the adaptive case, the algorithms are obtained by stochastic gradient methods. When all the independent components are estimated at the same time, the most popular algorithm in this category is natural gradient ascent of the likelihood, or of related contrast functions such as infomax [1,2,12,33,28,26]. In the one-unit case, straightforward stochastic gradient methods give adaptive algorithms that maximize negentropy or its approximations [40,71,103,73].
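As an illustration, the multi-unit adaptive case can be sketched with the natural gradient update rule W ← W + μ(I − φ(y)yᵀ)W, where y = Wx and φ is a score nonlinearity (tanh is a common choice for super-Gaussian sources). The mixing matrix, learning rate, and sample sizes below are arbitrary illustration values, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
n = 20000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # hypothetical mixing matrix
X = A @ S

W = np.eye(2)    # estimate of the separating matrix
mu = 0.01        # learning rate (illustrative value)
for t in range(n):
    x = X[:, t:t + 1]          # one sample at a time: adaptive mode
    y = W @ x
    # Natural-gradient ascent of the likelihood, tanh score for
    # super-Gaussian sources: W += mu * (I - tanh(y) y^T) W
    W += mu * (np.eye(2) - np.tanh(y) @ y.T) @ W

Y = W @ X   # recovered sources, up to permutation and scaling
```

Because the update multiplies the gradient by WᵀW on the right, it avoids matrix inversion at each step, which is what makes the natural gradient form attractive for adaptive processing.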
When the computations are made in batch (block) mode, much more efficient algorithms are available. The tensor-based methods [29,36] are efficient in small dimensions, but they cannot be used in larger dimensions. The FastICA algorithm, based on a fixed-point iteration, is a very efficient batch algorithm that can be used to maximize both one-unit contrast functions [72,60,65] and multi-unit contrast functions, including the likelihood.
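A minimal sketch of the one-unit FastICA fixed-point iteration, assuming whitened data and the tanh nonlinearity: w ← E{z g(wᵀz)} − E{g′(wᵀz)} w, followed by normalization. The data generation below is an illustrative assumption, not from the original text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test data: two Laplacian sources, linearly mixed.
n = 5000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
X = A @ S

# Whiten the mixtures (FastICA operates on whitened data).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration with g(u) = tanh(u).
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    u = w @ Z
    g = np.tanh(u)
    g_prime = 1.0 - np.tanh(u) ** 2
    # Fixed-point update: w <- E{z g(w^T z)} - E{g'(w^T z)} w
    w_new = (Z * g).mean(axis=1) - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    if abs(w_new @ w) > 1 - 1e-9:   # converged (direction unchanged)
        w = w_new
        break
    w = w_new

s_est = w @ Z   # one estimated independent component
```

Unlike the adaptive update, each iteration here uses all the samples at once, which is what gives the fixed-point scheme its cubic-like convergence in batch mode.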