Abstract
Learning covariance matrices of Gaussian distributions is at the heart of most variable-metric randomized algorithms for continuous optimization. If the search space dimensionality is high, updating the covariance matrix or its factorization is computationally expensive. Therefore, we adopt an algorithm from numerical mathematics for rank-one updates of Cholesky factors. Our method results in a quadratic-time covariance matrix update scheme with minimal memory requirements. The numerically stable algorithm leads to triangular Cholesky factors. Systems of linear equations in which the coefficient matrix is triangular can be solved in quadratic time. This can be exploited to avoid the additional iterative update of the inverse Cholesky factor required by some covariance matrix adaptation algorithms proposed in the literature. When used with the (1+1)-CMA-ES and the multi-objective CMA-ES, the new method reduces memory requirements by a factor of almost four and speeds up the covariance matrix update. The numerical stability and runtime improvements are demonstrated on a set of benchmark functions.
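As a rough illustration of the ingredients described in the abstract (not the authors' implementation), the NumPy sketch below performs a rank-one update of a lower-triangular Cholesky factor in quadratic time and solves a triangular system by forward substitution, also in quadratic time. The update form C ← C + β v vᵀ with β > 0 and the function names `chol_rank_one_update` and `solve_lower_triangular` are assumptions made for this example.

```python
import numpy as np

def chol_rank_one_update(L, v, beta=1.0):
    """Return the lower-triangular factor of C + beta * v v^T, given C = L L^T.

    Illustrative O(n^2) rank-one update (assumes beta > 0); not the paper's code.
    """
    L = L.copy()
    w = np.sqrt(beta) * v.astype(float)   # scaled copy of the update vector
    n = L.shape[0]
    for k in range(n):
        r = np.hypot(L[k, k], w[k])            # new diagonal entry
        c, s = r / L[k, k], w[k] / L[k, k]     # Givens-like rotation parameters
        L[k, k] = r
        if k + 1 < n:
            L[k + 1:, k] = (L[k + 1:, k] + s * w[k + 1:]) / c
            w[k + 1:] = c * w[k + 1:] - s * L[k + 1:, k]
    return L

def solve_lower_triangular(L, y):
    """Solve L x = y by forward substitution in O(n^2) time."""
    n = L.shape[0]
    x = np.zeros(n)
    for i in range(n):
        x[i] = (y[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

# Small usage check on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = A @ A.T + 5 * np.eye(5)                  # a positive-definite "covariance"
L = np.linalg.cholesky(C)
v = rng.standard_normal(5)
L_new = chol_rank_one_update(L, v, beta=0.5)
assert np.allclose(L_new @ L_new.T, C + 0.5 * np.outer(v, v))
y = rng.standard_normal(5)
assert np.allclose(L_new @ solve_lower_triangular(L_new, y), y)
```

Because the updated factor stays triangular, sampling and whitening steps that need L⁻¹ y can use forward substitution directly, which is the point the abstract makes about avoiding a separately maintained inverse Cholesky factor.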
Original language | English |
---|---|
Title of host publication | Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII |
Number of pages | 8 |
Publisher | Association for Computing Machinery |
Publication date | 17 Jan 2015 |
Pages | 129-136 |
ISBN (Print) | 978-1-4503-3434-1 |
DOIs | |
Publication status | Published - 17 Jan 2015 |
Event | ACM Conference on Foundations of Genetic Algorithms 2015, Aberystwyth, Wales, United Kingdom. Duration: 17 Jan 2015 → 20 Jan 2015. Conference number: 13 |
Conference
Conference | ACM Conference on Foundations of Genetic Algorithms 2015 |
---|---|
Number | 13 |
Country/Territory | United Kingdom |
City | Aberystwyth, Wales |
Period | 17/01/2015 → 20/01/2015 |