Jaakko Peltonen, Jacob Goldberger, and Samuel Kaski.
Fast Discriminative Component Analysis for Comparing Examples.
In NIPS 2006 workshop on Learning to Compare Examples, December 8, Whistler, Canada.

Two recent methods, Neighborhood Components Analysis (NCA) and Informative Discriminant Analysis (IDA), search for a class-discriminative subspace or discriminative components of data, which is equivalent to learning distance metrics invariant to changes perpendicular to the subspace. Constraining metrics to a subspace is useful both for regularizing the metrics and for dimensionality reduction. We introduce a variant of NCA and IDA that reduces their computational complexity from quadratic to linear in the number of data samples, by replacing their purely nonparametric class density estimates with semiparametric mixtures of Gaussians. In terms of accuracy, the method is shown to perform as well as NCA on benchmark data sets, outperforming several popular linear dimensionality reduction methods.
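The core idea can be illustrated with a small sketch: instead of NCA's pairwise (quadratic-cost) kernel density over all samples, each class density in the projected space is modeled by a K-component Gaussian mixture, so evaluating class posteriors costs O(nK) rather than O(n^2). The sketch below is an illustration of this density-replacement idea only, not the authors' implementation: the projection A is fixed by hand rather than optimized, the mixtures use isotropic covariances fitted with a few k-means-style hard-assignment updates, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two classes in 5-D; the discriminative structure lies in the
# first two dimensions (class 0 is shifted there by +3).
n_per_class, d = 100, 5
X0 = rng.normal(0.0, 1.0, (n_per_class, d)); X0[:, :2] += 3.0
X1 = rng.normal(0.0, 1.0, (n_per_class, d))
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Fixed 2-D projection for illustration; in the actual method the
# projection would be learned by maximizing the class posteriors.
A = np.zeros((2, d)); A[0, 0] = 1.0; A[1, 1] = 1.0

def fit_class_mixtures(Z, y, K=3, n_iter=10, rng=rng):
    """Per class, fit a K-component isotropic Gaussian mixture in the
    projected space with a few k-means-style hard-assignment updates."""
    mixtures = {}
    for c in np.unique(y):
        Zc = Z[y == c]
        means = Zc[rng.choice(len(Zc), K, replace=False)]
        for _ in range(n_iter):
            assign = np.argmin(((Zc[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
            for k in range(K):
                if np.any(assign == k):
                    means[k] = Zc[assign == k].mean(0)
        weights = np.bincount(assign, minlength=K) / len(Zc)
        mixtures[c] = (means, weights)
    return mixtures

def log_density(Z, means, weights, sigma2=1.0):
    # log sum_k w_k N(z | m_k, sigma2 * I), evaluated for each row of Z.
    sq = ((Z[:, None, :] - means[None]) ** 2).sum(-1)
    logp = -0.5 * sq / sigma2 - 0.5 * Z.shape[1] * np.log(2 * np.pi * sigma2)
    return np.log((np.exp(logp) * np.maximum(weights, 1e-12)).sum(1) + 1e-300)

Z = X @ A.T
mix = fit_class_mixtures(Z, y)
# Class posteriors p(c | A x): O(n * K) mixture evaluations replace
# NCA's O(n^2) pairwise kernel terms.
logs = np.stack([log_density(Z, *mix[c]) for c in (0, 1)], axis=1)
post = np.exp(logs - logs.max(1, keepdims=True))
post /= post.sum(1, keepdims=True)
acc = (post.argmax(1) == y).mean()
```

On this well-separated toy data the mixture-based posteriors classify nearly all points correctly; the point of the sketch is that the per-sample cost depends on the number of mixture components K, not on the dataset size.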



S. Kaski and J. Peltonen belong to the Adaptive Informatics Research Centre, a national centre of excellence of the Academy of Finland. They were supported by grant 108515 and by the University of Helsinki's Research Funds. This work was also supported in part by the IST Programme of the European Community, under the PASCAL Network of Excellence, IST-2002-506778. This publication only reflects the authors' views. All rights are reserved because of other commitments.