Samuel Kaski and Janne Sinkkonen. Metrics that learn relevance. In Proceedings of IJCNN-2000, International Joint Conference on Neural Networks, volume V, pages 547--552. IEEE Service Center, Piscataway, NJ, 2000.

We introduce an algorithm for learning a local metric for a continuous input space that measures distances in terms of relevance to the processing task. Relevance is defined by local changes in discrete auxiliary information, which may be, for example, the class of the data items, an index of performance, or a contextual input. A set of neurons first learns representations that maximize the mutual information between their outputs and the random variable representing the auxiliary information. The implicit knowledge gained about relevance is then transformed into a new metric on the input space that measures the change in the auxiliary information, in the sense of local approximations to the Kullback-Leibler divergence. The new metric can be used in further processing by other algorithms. It is especially useful in data analysis applications, since the distances can be interpreted in terms of the local relevance of the original variables.
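The core idea can be illustrated with a small sketch. This is not the paper's estimator: instead of the mutual-information-maximizing neurons, it assumes a hypothetical two-class logistic model for p(c|x) and computes the induced local metric from the Fisher information J(x), which gives the second-order approximation D_KL(p(c|x) || p(c|x+dx)) ≈ ½ dxᵀ J(x) dx.

```python
import numpy as np

def p_given_x(x, w, b):
    """Conditional class probabilities p(c|x) for an assumed 2-class logistic model."""
    p1 = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    return np.array([1.0 - p1, p1])

def fisher_metric(x, w, b):
    """J(x) = sum_c p(c|x) * grad log p(c|x) * grad log p(c|x)^T.

    For the logistic model, grad_x log p(1|x) = (1 - p1) w and
    grad_x log p(0|x) = -p1 w, so J(x) reduces to p0 * p1 * w w^T.
    """
    p = p_given_x(x, w, b)
    grads = [-p[1] * w, (1.0 - p[1]) * w]
    J = np.zeros((len(x), len(x)))
    for pc, g in zip(p, grads):
        J += pc * np.outer(g, g)
    return J

def local_distance_sq(x, dx, w, b):
    """Relevance-weighted squared local distance dx^T J(x) dx."""
    return float(dx @ fisher_metric(x, w, b) @ dx)

# Toy setup: class probability varies only along the first input axis,
# so only that direction is "relevant".
w, b = np.array([2.0, 0.0]), 0.0
x = np.zeros(2)
d_relevant = local_distance_sq(x, np.array([0.1, 0.0]), w, b)
d_irrelevant = local_distance_sq(x, np.array([0.0, 0.1]), w, b)
# A step along the relevant axis is long in the learned metric;
# the same-sized step along the irrelevant axis has zero length.
```

The example makes the interpretability point of the abstract concrete: directions along which the auxiliary information does not change contribute nothing to the distance, while relevant directions are stretched.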



Sami Kaski <sami.kaski@hut.fi>
Last modified: Wed Mar 9 08:36:03 EET 2005