Samuel Kaski and Jaakko Peltonen.
Informative discriminant analysis.
In Tom Fawcett and Nina Mishra, editors, Proceedings of the Twentieth International Conference on Machine
Learning (ICML 2003), pp. 329–336. AAAI Press, Menlo Park, CA, 2003.
We introduce a probabilistic model that generalizes classical linear
discriminant analysis and gives an interpretation for the components
as informative or relevant components of data. The components
maximize the predictability of the class distribution, which is
asymptotically equivalent to (i) maximizing mutual information with
the classes, and (ii) finding principal components in the so-called
learning or Fisher metric. The Fisher metric measures only
distances that are relevant to the classes, that is, distances that
cause changes in the class distribution. The components have
applications in data exploration, visualization, and dimensionality
reduction. In empirical experiments, the method outperformed a Rényi
entropy-based alternative and classical linear discriminant analysis.
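As a rough sketch of the objective described above (the notation here is ours, not necessarily the paper's): with inputs $x_n$, class labels $c_n$, and a projection matrix $W$, the informative components are chosen to maximize the predictability of the classes from the projected data,

$$ W^{*} = \arg\max_{W} \sum_{n} \log \hat{p}(c_n \mid W^{\top} x_n), $$

where $\hat{p}$ is an estimate of the class distribution within the projection subspace. Asymptotically, maximizing this conditional log-likelihood is equivalent to maximizing the mutual information $I(c;\, W^{\top} x)$ between the classes and the projected data, which is the sense in which the components are "informative".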