Abstract
We consider general Gaussian latent tree models in which the observed variables are not restricted to be leaves of the tree. Extending related recent work, we give a full semi-algebraic description of the set of covariance matrices of any such model. In other words, we find polynomial constraints that characterize when a matrix is the covariance matrix of a distribution in a given latent tree model. However, leveraging these constraints to test a given model is often complicated by the large number of constraints and by singularities of individual polynomials, which may invalidate standard approximations to the relevant probability distributions. Illustrating with the star tree, we propose a new testing methodology that circumvents the singularity issues by trading off some statistical estimation efficiency, and that handles cases with many constraints through recent advances on Gaussian approximation for maxima of sums of high-dimensional random vectors. Our test avoids the need to maximize the possibly multimodal likelihood function of such models and is applicable to models with a larger number of variables. These points are illustrated in numerical experiments.
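To make the flavor of the approach concrete, below is a minimal illustrative sketch in Python, not the estimator or calibration developed in the paper. For the star tree (all observed variables are children of a single latent variable), the model implies that every tetrad σ_ij σ_kl − σ_ik σ_jl of the covariance matrix vanishes. The sketch evaluates a plug-in max statistic over these polynomial constraints and calibrates it with a Gaussian multiplier bootstrap, in the spirit of the high-dimensional maxima approximations mentioned above. The function name `star_tree_tetrad_test`, the naive delta-method studentization, and all tuning parameters are illustrative choices of ours; in particular, the naive studentization is exactly where the singularity issues discussed in the abstract can arise, which is what the paper's construction is designed to avoid.

```python
# Illustrative sketch only: max-type test of star-tree tetrad constraints,
# calibrated by a Gaussian multiplier bootstrap. Not the paper's procedure.
import numpy as np
from itertools import combinations


def star_tree_tetrad_test(X, n_boot=1000, alpha=0.05, seed=0):
    """X: (n, m) data matrix of m observed children of one latent variable.

    Returns the observed max statistic and a bootstrap critical value;
    the star tree hypothesis is rejected when observed > critical.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    Xc = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False, bias=True)

    def prod(a, b):
        # Per-observation products; their mean is the sample covariance.
        return Xc[:, a] * Xc[:, b]

    stats, infl = [], []
    for i, j, k, l in combinations(range(m), 4):
        # The star tree implies all tetrads vanish; two per quadruple
        # suffice, since the third is their difference.
        for a, b, c, d in [(i, j, k, l), (i, j, l, k)]:
            t = cov[a, b] * cov[c, d] - cov[a, c] * cov[b, d]
            # Delta-method linearization: per-observation influence terms.
            h = (cov[c, d] * prod(a, b) + cov[a, b] * prod(c, d)
                 - cov[b, d] * prod(a, c) - cov[a, c] * prod(b, d))
            stats.append(np.sqrt(n) * t)
            infl.append(h - h.mean())

    stats = np.abs(np.array(stats))   # vector of tetrad statistics
    infl = np.array(infl)             # (p, n) influence terms
    sd = infl.std(axis=1, ddof=1)     # naive studentization; degenerate
    # variances (sd near 0) are the singularities noted in the abstract
    observed = np.max(stats / sd)

    # Multiplier bootstrap for the maximum of the approximately Gaussian
    # vector of studentized tetrad statistics.
    e = rng.standard_normal((n_boot, n))
    boot = np.max(np.abs(e @ infl.T) / (np.sqrt(n) * sd), axis=1)
    return observed, np.quantile(boot, 1 - alpha)


# Example: data simulated from a one-factor (star tree) model should
# typically not be rejected at level alpha.
n, m = 500, 6
H = np.random.default_rng(1).standard_normal((n, 1))
lam = np.linspace(0.5, 1.5, m)
X = H * lam + np.random.default_rng(2).standard_normal((n, m))
obs, crit = star_tree_tetrad_test(X)
print(obs, crit)
```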
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 31 (NIPS 2018) |
Number of pages | 10 |
Publisher | Neural Information Processing Systems Foundation |
Publication date | 2018 |
Publication status | Published - 2018 |
Event | Thirty-second Conference on Neural Information Processing Systems (NeurIPS 2018) - Palais des Congrès de Montréal, Montréal, Canada. Duration: 2 Dec 2018 → 8 Dec 2018. Conference number: 32 |
Conference
Conference | Thirty-second Conference on Neural Information Processing Systems (NeurIPS 2018) |
---|---|
Number | 32 |
Location | Palais des Congrès de Montréal |
Country/Territory | Canada |
City | Montréal |
Period | 02/12/2018 → 08/12/2018 |