Optimistic semi-supervised least squares classification

Jesse H. Krijthe, Marco Loog

Abstract

The goal of semi-supervised learning is to improve supervised classifiers by using additional unlabeled training examples. In this work we study a simple self-learning approach to semi-supervised learning applied to the least squares classifier. We show that a soft-label and a hard-label variant of self-learning can be derived by applying block coordinate descent to two related but slightly different objective functions. The resulting soft-label approach is related to an idea about dealing with missing data that dates back to the 1930s. We show that the soft-label variant typically outperforms the hard-label variant on benchmark datasets and partially explain this behaviour by studying the relative difficulty of finding good local minima for the corresponding objective functions.
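The alternation the abstract describes (impute labels for the unlabeled points, then refit the least squares classifier, and repeat) can be sketched as follows. This is a minimal illustration under assumed 0/1 label encoding, not the paper's exact objective functions; all function names here are illustrative.

```python
import numpy as np

def lsq_fit(X, y):
    # Least squares classifier: regress 0/1 labels on features plus an intercept.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def lsq_predict(X, w):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return Xb @ w

def self_learn(X_l, y_l, X_u, n_iter=50, soft=True):
    # Block coordinate descent: alternate between (1) imputing labels for the
    # unlabeled data from the current classifier and (2) refitting on all data.
    # soft=True keeps imputed labels as responsibilities in [0, 1] (soft-label
    # variant); soft=False thresholds them to {0, 1} (hard-label variant).
    w = lsq_fit(X_l, y_l)
    for _ in range(n_iter):
        y_u = np.clip(lsq_predict(X_u, w), 0.0, 1.0)
        if not soft:
            y_u = (y_u > 0.5).astype(float)
        X_all = np.vstack([X_l, X_u])
        y_all = np.concatenate([y_l, y_u])
        w = lsq_fit(X_all, y_all)
    return w
```

Each step either re-imputes the unlabeled labels or refits the weights while holding the other block fixed, so each step does not increase the corresponding joint objective; which of the two variants reaches the better local minimum is exactly the question the paper studies empirically.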

Original language: English
Title: 23rd International Conference on Pattern Recognition, ICPR 2016
Number of pages: 6
Publisher: IEEE
Publication date: 1 Jan 2016
Pages: 1677-1682
ISBN (electronic): 978-1-5090-4847-2
DOI
Status: Published - 1 Jan 2016
Event: 23rd International Conference on Pattern Recognition - Cancun, Mexico
Duration: 4 Dec 2016 - 8 Dec 2016
Conference number: 23

Conference

Conference: 23rd International Conference on Pattern Recognition
Number: 23
Country/Territory: Mexico
City: Cancun
Period: 04/12/2016 - 08/12/2016
