Optimistic semi-supervised least squares classification

Jesse H. Krijthe, Marco Loog

Abstract

The goal of semi-supervised learning is to improve supervised classifiers by using additional unlabeled training examples. In this work we study a simple self-learning approach to semi-supervised learning applied to the least squares classifier. We show that a soft-label and a hard-label variant of self-learning can be derived by applying block coordinate descent to two related but slightly different objective functions. The resulting soft-label approach is related to an idea about dealing with missing data that dates back to the 1930s. We show that the soft-label variant typically outperforms the hard-label variant on benchmark datasets and partially explain this behaviour by studying the relative difficulty of finding good local minima for the corresponding objective functions.
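
To make the two variants concrete, here is a minimal, illustrative sketch of hard- and soft-label self-learning for a binary least squares classifier with a 0/1 label coding. The function names, the fixed iteration count, and the intercept handling are assumptions made for illustration, not the authors' implementation; both loops instantiate the block coordinate descent described in the abstract, alternating between the classifier weights and the imputed labels of the unlabeled examples.

```python
import numpy as np

def lsq_fit(X, y):
    # Least squares weights via the Moore-Penrose pseudo-inverse.
    return np.linalg.pinv(X) @ y

def self_learn(X_lab, y_lab, X_unl, soft=True, n_iter=50):
    # Add an intercept column to the labeled and unlabeled design matrices.
    Xl = np.hstack([np.ones((X_lab.shape[0], 1)), X_lab])
    Xu = np.hstack([np.ones((X_unl.shape[0], 1)), X_unl])
    w = lsq_fit(Xl, y_lab)  # initialize on the labeled data alone
    for _ in range(n_iter):
        scores = Xu @ w
        if soft:
            # Soft labels: clipping predictions to [0, 1] minimizes the
            # squared loss over labels constrained to the unit interval.
            u = np.clip(scores, 0.0, 1.0)
        else:
            # Hard labels: thresholding at 0.5 minimizes the squared loss
            # over labels restricted to {0, 1}.
            u = (scores > 0.5).astype(float)
        # Refit on the labeled data plus the currently imputed labels.
        w = lsq_fit(np.vstack([Xl, Xu]), np.concatenate([y_lab, u]))
    return w
```

The only difference between the two variants is the inner label-update step: the soft version keeps real-valued responsibilities in [0, 1], while the hard version commits to a class at every iteration, which is one way to see why the soft objective tends to be easier to descend without getting trapped in poor local minima.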

Original language: English
Title of host publication: 23rd International Conference on Pattern Recognition, ICPR 2016
Number of pages: 6
Publisher: IEEE
Publication date: 1 Jan 2016
Pages: 1677-1682
ISBN (Electronic): 978-1-5090-4847-2
DOI:
Publication status: Published - 1 Jan 2016
Event: 23rd International Conference on Pattern Recognition - Cancun, Mexico
Duration: 4 Dec 2016 – 8 Dec 2016
Conference number: 23

Conference

Conference: 23rd International Conference on Pattern Recognition
Number: 23
Country/Territory: Mexico
City: Cancun
Period: 04/12/2016 – 08/12/2016
