Jaakko Peltonen, Arto Klami, and Samuel Kaski. Improved Learning of Riemannian Metrics for Exploratory Analysis. Neural Networks, vol. 17, pages 1087-1100, 2004. © Elsevier Ltd. (Preprint available as gzipped PostScript; the final paper and an erratum to it are available on the Elsevier pages.)

We have previously introduced a principle for learning metrics, which shows how metric-based methods can be made to focus on the discriminative properties of data. The main application is in supervising unsupervised learning so that it models the interesting variation in the data, instead of all variation as plain unsupervised learning does. The metrics are derived through approximations to an information-geometric formulation. In this paper we review the theory, introduce better approximations to the distances, and show how to apply them in two different kinds of unsupervised methods: prototype-based and pairwise-distance-based. The two examples are self-organizing maps and multidimensional scaling (Sammon's mapping).
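To illustrate the flavor of the learning-metrics principle, here is a minimal sketch (not the authors' code) of the simplest, local approximation of such a metric: p(c|x) is estimated with a Gaussian Parzen-style conditional estimator, the Fisher information matrix J(x) of p(c|x) is formed, and a nearby pair of points is compared with the quadratic form d^2(x, z) ≈ (x - z)^T J(x) (x - z). The kernel width sigma and the toy data are assumptions made purely for the example.

```python
# Sketch of a local learning-metric distance based on the Fisher information of p(c|x).
# This is an illustration under assumed choices (Parzen-style estimator, kernel width),
# not the estimator or the improved distance approximations used in the paper.
import numpy as np

def conditional_probs_and_grads(x, X, y, n_classes, sigma=0.5):
    """Parzen-style estimate of p(c|x) and the gradients d/dx log p(c|x)."""
    diff = X - x                                                  # (n, d)
    k = np.exp(-np.sum(diff ** 2, axis=1) / (2.0 * sigma ** 2))  # Gaussian kernel values
    S = k.sum() + 1e-12
    grad_S = (k[:, None] * diff).sum(axis=0) / sigma ** 2        # d/dx of sum_i K(x, x_i)
    p = np.empty(n_classes)
    grads = np.empty((n_classes, X.shape[1]))
    for c in range(n_classes):
        mask = (y == c)
        S_c = k[mask].sum() + 1e-12
        grad_Sc = (k[mask][:, None] * diff[mask]).sum(axis=0) / sigma ** 2
        p[c] = S_c / S
        grads[c] = grad_Sc / S_c - grad_S / S  # grad log p(c|x) = grad log S_c - grad log S
    return p, grads

def fisher_information(x, X, y, n_classes, sigma=0.5):
    """J(x) = sum_c p(c|x) * grad log p(c|x) grad log p(c|x)^T."""
    p, grads = conditional_probs_and_grads(x, X, y, n_classes, sigma)
    return (p[:, None, None] * grads[:, :, None] * grads[:, None, :]).sum(axis=0)

def local_learning_metric_distance(x, z, X, y, n_classes, sigma=0.5):
    """One-point approximation: d(x, z) ~= sqrt((x - z)^T J(x) (x - z))."""
    J = fisher_information(x, X, y, n_classes, sigma)
    d = x - z
    return np.sqrt(max(d @ J @ d, 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: the classes differ only along the first coordinate, so the learned
    # metric should stretch that direction and shrink the class-irrelevant one.
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] > 0).astype(int)
    x, step = np.array([0.0, 0.0]), 0.3
    print("class-relevant direction:  ",
          local_learning_metric_distance(x, x + np.array([step, 0.0]), X, y, 2))
    print("class-irrelevant direction:",
          local_learning_metric_distance(x, x + np.array([0.0, step]), X, y, 2))
```

This one-point quadratic form is only the local starting point; the improved approximations discussed in the paper concern better estimates of the full path-based distances, which accumulate such local distances rather than evaluating the metric at a single point. Distances of this kind can then be plugged into prototype-based methods such as the self-organizing map or into pairwise-distance methods such as Sammon's mapping.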

This work was supported in part by the IST Programme of the European Community, under the PASCAL Network of Excellence, IST-2002-506778. This publication only reflects the authors' views. The authors acknowledge that access rights to the materials produced in this project are restricted due to other commitments.