Abstract
Radial kernels are well-suited for machine learning over general geodesic metric spaces, where pairwise distances are often the only computable quantity available. We have recently shown that geodesic exponential kernels are positive definite for all bandwidths only when the input space has strong linear properties. This negative result hints that radial kernels are perhaps not suitable over geodesic metric spaces after all. Here, however, we present evidence that large intervals of bandwidths exist where geodesic exponential kernels are positive definite with high probability over finite datasets, while still having significant predictive power. From this we formulate conjectures on the probability of a positive definite kernel matrix for a finite random sample, depending on the geometry of the data space and the spread of the sample.
Original language | English |
---|---|
Title | 29th Annual Conference on Learning Theory |
Editors | Vitaly Feldman, Alexander Rakhlin, Ohad Shamir |
Number of pages | 4 |
Publication date | 6 Jun 2016 |
Pages | 1647–1650 |
Status | Published - 6 Jun 2016 |
Event | 29th Conference on Learning Theory, New York, USA. Duration: 23 Jun 2016 → 26 Jun 2016. Conference number: 29 |
Conference

Conference | 29th Conference on Learning Theory |
---|---|
Number | 29 |
Country/Territory | USA |
City | New York |
Period | 23/06/2016 → 26/06/2016 |
Name | JMLR: Workshop and Conference Proceedings |
---|---|
Volume | 49 |