Inferring feature relevances from metric learning

Alexander Schulz, Bassam Mokbel, Michael Biehl, Barbara Hammer


Abstract

Powerful metric learning algorithms proposed in recent years not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but also make these operations interpretable by assigning explicit relevance weights to the individual data components. Starting with the work [SSCI13Stretal], however, it has been noticed that this procedure has very limited validity in the important case of high data dimensionality or high feature correlations: the resulting relevance profiles are random to a large extent, leading to invalid interpretations and to fluctuations in accuracy for novel data. While the work [SSCI13Stretal] proposes a first cure by means of L2-regularisation, it only preserves strongly relevant features, leaving weakly relevant and not necessarily unique features undetected. In this contribution, we enhance the technique by an efficient linear programming scheme which enables the unique identification of a relevance interval for every observed feature, this way identifying both strongly and weakly relevant features for a given metric.
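The abstract does not spell out the optimisation problem, but the following minimal sketch illustrates the kind of linear programming scheme it refers to, under illustrative assumptions: the learned metric is a diagonal weighted squared Euclidean distance with relevance vector lambda, and a relevance interval for feature j is obtained by minimising and maximising lambda_j over all non-negative, normalised relevance vectors that reproduce the learned pairwise distances up to a tolerance. All function names, the tolerance eps, and the exact constraint set are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linprog


def relevance_intervals(X, lam_learned, eps=1e-2):
    """Sketch: [min, max] relevance interval per feature via two LPs each.

    Assumes d_lambda(x, y) = sum_i lambda_i * (x_i - y_i)^2, so pairwise
    distances are linear in lambda and the feasibility constraints are linear.
    """
    n, d = X.shape

    # Squared coordinate-wise differences for all pairs (upper triangle).
    ii, jj = np.triu_indices(n, k=1)
    S = (X[ii] - X[jj]) ** 2                 # shape (n_pairs, d)
    dist_ref = S @ lam_learned               # distances under the learned metric

    # |S @ lam - dist_ref| <= eps, written as A_ub @ lam <= b_ub.
    A_ub = np.vstack([S, -S])
    b_ub = np.concatenate([dist_ref + eps, -(dist_ref - eps)])
    A_eq = np.ones((1, d))                    # normalisation: sum(lambda) = 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * d                  # lambda >= 0

    intervals = np.zeros((d, 2))
    for j in range(d):
        c = np.zeros(d)
        c[j] = 1.0
        lo = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        intervals[j] = [lo.x[j], hi.x[j]]
    return intervals
```

In this reading, a feature whose interval has a large lower bound is strongly relevant (no equivalent metric can do without it), while a feature with a zero lower bound but a large upper bound is weakly relevant, e.g. because it is correlated with other features that can take over its role.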

Original language: English
Title of host publication: Proceedings - 2015 IEEE Symposium Series on Computational Intelligence: SSCI 2015
Number of pages: 8
Publisher: IEEE
Publication date: 2015
Pages: 1599-1606
ISBN (Print): 9781479975600
DOIs:
Publication status: Published - 2015
Series: Proceedings - 2015 IEEE Symposium Series on Computational Intelligence, SSCI 2015
