Computational complexity of linear large margin classification with ramp loss

Søren Frejstrup Maibing, Christian Igel


Abstract

Minimizing the binary classification error with a linear model leads to an NP-hard problem. In practice, surrogate loss functions are used instead, in particular loss functions leading to large margin classification such as the hinge loss and the ramp loss. The intuitive large margin concept is theoretically supported by generalization bounds linking the expected classification error to the empirical margin error and the complexity of the considered hypothesis class. This article addresses the fundamental question of the computational complexity of determining whether there is a hypothesis class containing a hypothesis such that the upper bound on the generalization error is below a certain value. Results of this type are important for model comparison and selection. This paper takes a first step and proves that minimizing a basic margin bound is NP-hard when considering linear hypotheses and the p-margin loss function, which generalizes the ramp loss. This result directly implies the hardness of ramp loss minimization.
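To illustrate the loss functions the abstract refers to, the following sketch implements the standard hinge loss and a common formulation of the ρ-margin (ramp-type) loss on the margin z = y·⟨w, x⟩; with ρ = 1 it reduces to the usual ramp loss. The function names are illustrative, and the exact parametrization used in the paper may differ from this textbook form.

```python
import numpy as np

def hinge_loss(z):
    """Hinge loss on the margin z = y * <w, x>: a convex surrogate
    for the 0-1 classification error."""
    return np.maximum(0.0, 1.0 - z)

def rho_margin_loss(z, rho=1.0):
    """rho-margin loss (illustrative form): a clipped, rescaled hinge loss.
    Unlike the hinge loss, it is bounded by 1 and non-convex, which is the
    source of the hardness results discussed in the paper.
    rho = 1 recovers the standard ramp loss."""
    return np.minimum(1.0, np.maximum(0.0, 1.0 - z / rho))

# Badly misclassified points (large negative margin) incur unbounded
# hinge loss but at most 1 under the ramp loss:
print(hinge_loss(-3.0))        # 4.0
print(rho_margin_loss(-3.0))   # 1.0
```

The clipping at 1 is what makes the ramp loss robust to outliers but non-convex, so minimizing it cannot be done by standard convex programming, consistent with the NP-hardness result stated above.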

Original language: English
Title: Proceedings of the 18th International Conference on Artificial Intelligence and Statistics (AISTATS) 2015
Editors: Guy Lebanon, S. V. N. Vishwanathan
Number of pages: 9
Publication date: 2015
Pages: 259-267
Status: Published - 2015
Event: International Conference on Artificial Intelligence and Statistics 2015 - San Diego, Cal., USA
Duration: 9 May 2015 - 12 May 2015
Conference number: 18

Conference

Conference: International Conference on Artificial Intelligence and Statistics 2015
Number: 18
Country/Territory: USA
City: San Diego, Cal.
Period: 09/05/2015 - 12/05/2015
Name: JMLR: Workshop and Conference Proceedings
Volume: 38
