Sparsistency and rates of convergence in large covariance matrix estimation

Clifford Lam, Jianqing Fan

Statistics Theory and Methods mathscidoc:1912.43262

Annals of Statistics, 37, 4254, 2009
This paper studies the sparsistency and rates of convergence for estimating sparse covariance and precision matrices based on penalized likelihood with nonconvex penalty functions. Here, sparsistency refers to the property that all parameters that are zero are actually estimated as zero with probability tending to one. Depending on the application, sparsity may occur in the covariance matrix, its inverse, or its Cholesky decomposition. We study these three sparsity exploration problems under a unified framework with a general penalty function. We show that the rates of convergence for these problems under the Frobenius norm are of order (s_n log p_n / n)^{1/2}, where s_n is the number of nonzero elements, p_n is the size of the covariance matrix, and n is the sample size. This explicitly spells out that the contribution of high dimensionality is merely a logarithmic factor. The conditions on the rate with which …
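To see how the (log p_n / n)^{1/2} scaling drives sparsistency in practice, here is a minimal sketch. It does not implement the paper's penalized-likelihood estimator with nonconvex penalties; instead it uses a much simpler stand-in, entrywise soft-thresholding of the sample covariance at a level proportional to sqrt(log p / n), to illustrate that zero entries are killed while strong signal entries survive. All names, dimensions, and the constant c = 2 are illustrative choices, not from the paper.

```python
# Sketch only: soft-thresholding of the sample covariance at level
# c * sqrt(log p / n), a simpler stand-in for the paper's nonconvex
# penalized-likelihood estimators, to illustrate the sparsistency scaling.
import numpy as np

def soft_threshold_cov(X, c=2.0):
    """Soft-threshold the off-diagonal entries of the sample covariance.

    The threshold lam = c * sqrt(log p / n) mirrors the (log p_n / n)^{1/2}
    factor in the convergence rate; c is an illustrative constant.
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)
    lam = c * np.sqrt(np.log(p) / n)
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)
    np.fill_diagonal(T, np.diag(S))  # leave the diagonal unpenalized
    return T, lam

# Toy model: identity covariance plus two strong off-diagonal pairs.
rng = np.random.default_rng(0)
p, n = 20, 500
Sigma = np.eye(p)
Sigma[0, 1] = Sigma[1, 0] = 0.5
Sigma[2, 3] = Sigma[3, 2] = 0.5
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

Sigma_hat, lam = soft_threshold_cov(X)
# Noise in the zero entries is O(n^{-1/2}), well below lam, so most true
# zeros are estimated as exactly zero, while the 0.5 entries survive.
```

Here sparsistency shows up empirically: sampling noise on the true-zero entries has standard deviation of order n^{-1/2}, which falls below the threshold, while the signal entries at 0.5 comfortably exceed it.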
[ 2019-12-21 11:33:36 uploaded by Jianqing_Fan ]
@article{clifford2009sparsistency,
  title={Sparsistency and rates of convergence in large covariance matrix estimation},
  author={Lam, Clifford and Fan, Jianqing},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20191221113336780830822},
  journal={Annals of Statistics},
  volume={37},
  pages={4254},
  year={2009},
}
Clifford Lam and Jianqing Fan. Sparsistency and rates of convergence in large covariance matrix estimation. Annals of Statistics 37 (2009), p. 4254. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20191221113336780830822.