Abstract
In this paper, we show that the largest and smallest eigenvalues of a sample correlation matrix stemming from n independent observations of a p-dimensional time series with iid components converge almost surely to (1+√γ)² and (1−√γ)², respectively, as n→∞, provided p∕n→γ∈(0,1] and the truncated variance of the entry distribution is “almost slowly varying”, a condition we describe via moment properties of self-normalized sums. Moreover, the empirical spectral distributions of these sample correlation matrices converge weakly, with probability 1, to the Marčenko–Pastur law, which extends a result in Bai and Zhou (2008). We compare the behavior of the eigenvalues of the sample covariance and sample correlation matrices and argue that the latter seems more robust, in particular in the case of infinite fourth moment. We briefly address some practical issues for the estimation of extreme eigenvalues in a simulation study. In our proofs we use the method of moments combined with a Path-Shortening Algorithm, which efficiently exploits the structure of sample correlation matrices, to calculate precise bounds for matrix norms. We believe that this new approach could be of further use in random matrix theory.
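The limiting values (1+√γ)² and (1−√γ)² can be illustrated numerically. The sketch below (not taken from the paper; parameter choices and the use of Gaussian entries are our own assumptions for illustration) simulates n iid p-dimensional observations, forms the sample correlation matrix, and compares its extreme eigenvalues with the stated limits for γ = p/n.

```python
import numpy as np

# Illustrative sketch, assuming iid standard normal entries; the paper's
# result covers a wider class of entry distributions.
rng = np.random.default_rng(0)
n, p = 2000, 500                    # so gamma = p/n = 0.25
X = rng.standard_normal((n, p))     # n observations of a p-dim vector
R = np.corrcoef(X, rowvar=False)    # p x p sample correlation matrix
eig = np.linalg.eigvalsh(R)         # eigenvalues in ascending order
gamma = p / n

print("largest: ", eig[-1], " vs (1+sqrt(gamma))^2 =", (1 + gamma**0.5) ** 2)
print("smallest:", eig[0],  " vs (1-sqrt(gamma))^2 =", (1 - gamma**0.5) ** 2)
```

For these parameters the extreme eigenvalues land close to 2.25 and 0.25; since the diagonal of a correlation matrix is identically 1, the eigenvalues always average to exactly 1, regardless of the entry distribution.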
Original language | English |
---|---|
Journal | Stochastic Processes and Their Applications |
Volume | 128 |
Issue number | 8 |
Pages (from-to) | 2779-2815 |
Number of pages | 37 |
ISSN | 0304-4149 |
DOIs | |
Publication status | Published - Aug 2018 |
Keywords
- Combinatorics
- Infinite fourth moment
- Largest eigenvalue
- Regular variation
- Sample correlation matrix
- Sample covariance matrix
- Self-normalization
- Smallest eigenvalue
- Spectral distribution