Asymptotic Theory for Graphical SLOPE: Precision Estimation and Pattern Convergence
Ivan Hejný, Giovanni Bonaccolto, Philipp Kremer, Sandra Paterlini, Małgorzata Bogdan + 1 more
TLDR
This paper develops asymptotic theory for Graphical SLOPE, characterizing its precision-matrix estimation error and the convergence of its selected edge patterns, and introduces TSLOPE for heavy-tailed data.
Key contributions
- Establishes asymptotic theory for Graphical SLOPE's precision matrix estimation and pattern convergence.
- Shows that SLOPE's grouping property can substantially improve estimation accuracy over GLASSO when the precision matrix exhibits structured edge patterns.
- Quantifies variability inflation for GSLOPE under elliptical (non-Gaussian) distributions.
- Introduces TSLOPE, demonstrating its superior performance for heavy-tailed data.
Why it matters
This paper provides a robust theoretical foundation for Graphical SLOPE, demonstrating its ability to accurately estimate clustered dependencies in precision matrices. It introduces TSLOPE, a significant advancement for handling heavy-tailed data, which is common in real-world applications. This improves the reliability and applicability of graphical models for complex data analysis.
Original Abstract
This paper studies Graphical SLOPE for precision matrix estimation, with emphasis on its ability to recover both sparsity and clusters of edges with equal or similar strength. In a fixed-dimensional regime, we establish that the root-$n$ scaled estimation error converges to the unique minimizer of a strictly convex optimization problem defined through the directional derivative of the SLOPE penalty. We also establish convergence of the induced SLOPE pattern, thereby obtaining an asymptotic characterization of the clustering structure selected by the estimator. A comparison with GLASSO shows that the grouping property of SLOPE can substantially improve estimation accuracy when the precision matrix exhibits structured edge patterns. To assess the effect of departures from Gaussianity, we then analyze Gaussian-loss precision matrix estimation under elliptical distributions. In this setting, we derive the limiting distribution and quantify the inflation in variability induced by heavy tails relative to the Gaussian benchmark. We also study TSLOPE, based on the multivariate $t$-loss, and derive its limiting distribution. The results show that TSLOPE offers clear advantages over GSLOPE under heavy-tailed data-generating mechanisms. Simulation evidence suggests that these qualitative conclusions persist in high-dimensional settings, and an empirical application shows that SLOPE-based estimators, especially TSLOPE, can uncover economically meaningful clustered dependence structures.
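To make the object of study concrete, here is a minimal numpy sketch of the sorted-ℓ1 (SLOPE) penalty applied to the off-diagonal entries of a precision matrix, and the corresponding Gaussian-loss GSLOPE criterion. The function names and the exact weight placement are illustrative assumptions, not the paper's implementation; the general form (negative log-determinant plus trace term plus SLOPE penalty on off-diagonals) follows the standard GSLOPE setup.

```python
import numpy as np

def slope_penalty(theta, lam):
    """Sorted-l1 (SLOPE) penalty on the off-diagonal entries of a
    precision matrix estimate `theta`.

    `lam` is a non-increasing sequence of non-negative weights, one per
    upper-triangular off-diagonal entry. Equal weights recover the plain
    l1 (GLASSO-style) penalty; strictly decreasing weights induce the
    grouping of entries with equal or similar magnitude that the paper
    analyzes.
    """
    iu = np.triu_indices_from(theta, k=1)            # upper off-diagonals
    abs_sorted = np.sort(np.abs(theta[iu]))[::-1]    # |theta|_(1) >= |theta|_(2) >= ...
    return float(np.dot(lam, abs_sorted))

def gslope_objective(theta, S, lam):
    """Gaussian-loss GSLOPE criterion (a sketch, not the authors' code):
    -log det(theta) + tr(S theta) + SLOPE penalty on off-diagonals,
    where S is the sample covariance matrix."""
    sign, logdet = np.linalg.slogdet(theta)
    assert sign > 0, "theta must be positive definite"
    return -logdet + np.trace(S @ theta) + slope_penalty(theta, lam)
```

With all weights equal, `slope_penalty` reduces to the ℓ1 norm of the off-diagonal entries, which is why GLASSO appears as the natural comparison point in the paper.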