Abstract
Regularized least-squares classification is one of the most promising alternatives to standard support vector machines, with the desirable property of closed-form solutions that can be obtained analytically and efficiently. While the supervised, and mostly binary, case has received tremendous attention in recent years, unsupervised multi-class settings have not yet been considered. In this work we present an efficient implementation of the unsupervised extension of the multi-class regularized least-squares classification framework, which is, to the best of the authors' knowledge, the first in the literature to address this task. The resulting kernel-based framework efficiently combines steepest-descent strategies with powerful meta-heuristics to avoid local minima. The computational efficiency of the overall approach is ensured by matrix-algebra shortcuts that make efficient updates of the intermediate candidate solutions possible. Our experimental evaluation indicates the potential of the novel method and demonstrates its superior clustering performance over a variety of competing methods on real-world data sets.
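The closed-form solution mentioned in the abstract is the standard kernel regularized least-squares (ridge) solution A = (K + λI)⁻¹Y for a kernel matrix K and one-hot label matrix Y; in the unsupervised setting the labeling Y itself is unknown and must be searched over. The sketch below is only a minimal illustration of that idea, not the authors' algorithm: it evaluates the closed-form objective for candidate labelings and runs a naive local search over single-point reassignments. All function names and parameters are assumptions, and the paper's steepest-descent strategy, meta-heuristics, and matrix-algebra update shortcuts are deliberately replaced by plain recomputation.

```python
# Minimal, illustrative sketch (assumptions only, not the authors' method):
# closed-form kernel regularized least-squares plus a naive label search.
import numpy as np


def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)


def rls_objective(K, Y, lam):
    # Closed-form dual coefficients A = (K + lam*I)^{-1} Y and the
    # regularized least-squares loss ||K A - Y||_F^2 + lam * tr(A^T K A).
    A = np.linalg.solve(K + lam * np.eye(K.shape[0]), Y)
    residual = K @ A - Y
    return np.sum(residual ** 2) + lam * np.trace(A.T @ K @ A)


def naive_unsupervised_rls(K, n_classes, lam=1e-2, n_iter=50, seed=0):
    # Naive local search over cluster assignments: flip one point's label at a
    # time and keep the change if the objective improves.  Real maximum margin
    # clustering additionally enforces class-balance constraints (omitted here
    # for brevity) and uses algebraic shortcuts instead of full recomputation.
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(0, n_classes, size=n)
    Y = np.eye(n_classes)[labels]
    best = rls_objective(K, Y, lam)
    for _ in range(n_iter):
        improved = False
        for i in rng.permutation(n):
            for c in range(n_classes):
                if c == labels[i]:
                    continue
                Y_try = Y.copy()
                Y_try[i] = np.eye(n_classes)[c]
                obj = rls_objective(K, Y_try, lam)
                if obj < best:
                    best = obj
                    Y = Y_try
                    labels[i] = c
                    improved = True
        if not improved:
            break
    return labels, best


if __name__ == "__main__":
    # Two well-separated Gaussian blobs as a toy clustering problem.
    X = np.vstack([np.random.randn(20, 2) + 3, np.random.randn(20, 2) - 3])
    K = rbf_kernel(X, gamma=0.5)
    labels, obj = naive_unsupervised_rls(K, n_classes=2)
    print(labels, obj)
```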
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 12th IEEE International Conference on Data Mining, ICDM 2012 |
| Number of pages | 10 |
| Publisher | IEEE |
| Publication date | 2012 |
| Pages | 585-594 |
| Article number | 6413868 |
| ISBN (Print) | 978-1-4673-4649-8 |
| DOIs | |
| Publication status | Published - 2012 |
| Externally published | Yes |
| Event | 12th IEEE International Conference on Data Mining, ICDM 2012 - Brussels, Belgium, Duration: 10 Dec 2012 → 13 Dec 2012 |
Conference
| Conference | 12th IEEE International Conference on Data Mining, ICDM 2012 |
|---|---|
| Country/Territory | Belgium |
| City | Brussels |
| Period | 10/12/2012 → 13/12/2012 |
Keywords
- Maximum margin clustering
- Multi-class regularized least-squares classification
- Unsupervised learning