Abstract
The covariance matrix adaptation evolution strategy (CMA-ES) is arguably one of the most powerful real-valued derivative-free optimization algorithms, finding many applications in machine learning. The CMA-ES is a Monte Carlo method, sampling from a sequence of multivariate Gaussian distributions. Given the function values at the sampled points, updating and storing the covariance matrix dominates the time and space complexity in each iteration of the algorithm. We propose a numerically stable quadratic-time covariance matrix update scheme with minimal memory requirements based on maintaining triangular Cholesky factors. This requires a modification of the cumulative step-size adaptation (CSA) mechanism in the CMA-ES, in which we replace the inverse of the square root of the covariance matrix by the inverse of the triangular Cholesky factor. Because the triangular Cholesky factor changes smoothly with the matrix square root, this modification does not change the behavior of the CMA-ES in terms of required objective function evaluations, as verified empirically. Thus, the described algorithm can and should replace the standard CMA-ES if updating and storing the covariance matrix matters.
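As a rough illustration of the idea described in the abstract (not the authors' exact procedure, which also covers rank-mu updates, path cumulation, and full step-size control), the sketch below shows a standard O(d^2) rank-one update of a lower-triangular Cholesky factor and how the CSA term C^{-1/2}y can be replaced by a triangular solve with that factor. All names and constants in the snippet (`rank_one_chol_update`, `c1`, `c_sigma`, `p_c`, the learning-rate values) are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_triangular

def rank_one_chol_update(A, v, alpha, beta):
    """Return lower-triangular A' with A' A'^T = alpha * A A^T + beta * v v^T.

    Classic O(d^2) rank-one Cholesky update: the covariance matrix is never
    formed explicitly, only its triangular factor is maintained.
    """
    d = A.shape[0]
    A = np.sqrt(alpha) * A.copy()
    w = np.sqrt(beta) * np.asarray(v, dtype=float).copy()
    for k in range(d):
        r = np.hypot(A[k, k], w[k])
        c, s = r / A[k, k], w[k] / A[k, k]
        A[k, k] = r
        A[k + 1:, k] = (A[k + 1:, k] + s * w[k + 1:]) / c
        w[k + 1:] = c * w[k + 1:] - s * A[k + 1:, k]
    return A

# Hypothetical fragment of one iteration: sample, update the factor, update CSA.
d = 10
rng = np.random.default_rng(0)
A = np.eye(d)                       # Cholesky factor, C = A A^T
mean, sigma = np.zeros(d), 1.0
c1, c_sigma = 2.0 / d**2, 4.0 / d   # illustrative learning rates, not the paper's values

z = rng.standard_normal(d)
x = mean + sigma * A @ z            # sample from N(mean, sigma^2 C) without forming C

p_c = A @ z                         # stand-in for the evolution path used in the update
A = rank_one_chol_update(A, p_c, 1.0 - c1, c1)   # C <- (1-c1) C + c1 p_c p_c^T in O(d^2)

# CSA step: instead of multiplying by the inverse square root C^{-1/2},
# solve the triangular system A u = y, which also costs only O(d^2).
y = (x - mean) / sigma
p_sigma_increment = solve_triangular(A, y, lower=True)
```

The triangular solve is the point of the modification sketched above: it gives a quadratic-time substitute for the cubic-time computation of C^{-1/2}, while the rank-one factor update keeps the memory footprint at a single triangular matrix.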
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 29 (NIPS 2016) |
Editors | D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, R. Garnett |
Number of pages | 9 |
Publisher | Curran Associates, Inc. |
Publication date | 2016 |
Pages | 370-378 |
Publication status | Published - 2016 |
Event | 30th Annual Conference on Neural Information Processing Systems (No. 30), Barcelona, Spain, 5 Dec 2016 → 10 Dec 2016 |
Conference
Conference | 30th Annual Conference on Neural Information Processing Systems |
---|---|
Number | 30 |
Country/Territory | Spain |
City | Barcelona |
Period | 05/12/2016 → 10/12/2016 |