Abstract
Radial kernels are well-suited for machine learning over general geodesic metric spaces, where pairwise distances are often the only computable quantity available. We have recently shown that geodesic exponential kernels are positive definite for all bandwidths only when the input space has strong linear properties. This negative result hints that radial kernels are perhaps not suitable over geodesic metric spaces after all. Here, however, we present evidence that large intervals of bandwidths exist where geodesic exponential kernels have a high probability of being positive definite over finite datasets, while still retaining significant predictive power. From this we formulate conjectures on the probability of a positive definite kernel matrix for a finite random sample, depending on the geometry of the data space and the spread of the sample.
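The kind of experiment the abstract alludes to can be sketched in a few lines. The following is a minimal illustration, not the authors' experimental setup: it assumes the geodesic Gaussian form k(x, y) = exp(-λ d(x, y)²) on the unit sphere (where the geodesic distance is the great-circle arc length) and tests whether the resulting finite kernel matrix is positive (semi-)definite by inspecting its smallest eigenvalue across a few bandwidths λ. All function names and the choice of sample size are illustrative.

```python
import numpy as np

def sphere_geodesic_distances(X):
    """Pairwise great-circle distances between unit vectors given as rows of X."""
    cosines = np.clip(X @ X.T, -1.0, 1.0)  # clip guards arccos against round-off
    return np.arccos(cosines)

def is_positive_semidefinite(K, tol=1e-10):
    """Check the symmetric matrix K via its smallest eigenvalue, up to tolerance."""
    return np.linalg.eigvalsh(K).min() >= -tol

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # 50 random points on the sphere S^2

D = sphere_geodesic_distances(X)
for lam in [0.1, 1.0, 10.0]:
    K = np.exp(-lam * D**2)  # geodesic Gaussian kernel matrix at bandwidth lam
    print(f"lambda = {lam}: PSD = {is_positive_semidefinite(K)}")
```

Repeating such a check over many random samples and a grid of bandwidths gives an empirical estimate of the probability that the kernel matrix is positive definite, which is the quantity the conjectures above concern.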
Original language | English |
---|---|
Title of host publication | 29th Annual Conference on Learning Theory |
Editors | Vitaly Feldman, Alexander Rakhlin, Ohad Shamir |
Number of pages | 4 |
Publication date | 6 Jun 2016 |
Pages | 1647–1650 |
Publication status | Published - 6 Jun 2016 |
Event | 29th Conference on Learning Theory, New York, United States. Duration: 23 Jun 2016 → 26 Jun 2016. Conference number: 29 |
Conference
Conference | 29th Conference on Learning Theory |
---|---|
Number | 29 |
Country/Territory | United States |
City | New York |
Period | 23/06/2016 → 26/06/2016 |
Series | JMLR: Workshop and Conference Proceedings |
---|---|
Volume | 49 |