MathSciDoc: An Archive for Mathematician

Statistics Theory and Methods
mathscidoc:1912.43351

arXiv preprint arXiv:1603.03516
In statistics and machine learning, one is often interested in the eigenvectors (or singular vectors) of certain matrices (e.g., covariance matrices and data matrices). However, those matrices are usually perturbed by noise or statistical errors, arising either from random sampling or from structural patterns. One usually employs the Davis-Kahan $\sin\theta$ theorem to bound the difference between the eigenvectors of a matrix $A$ and those of a perturbed matrix $\widetilde{A} = A + E$, in terms of the $\ell_2$ norm. In this paper, we prove that when $A$ is a low-rank and incoherent matrix, the $\ell_\infty$ norm perturbation bound of singular vectors (or eigenvectors in the symmetric case) is smaller by a factor of $\sqrt{d_1}$ or $\sqrt{d_2}$ for left and right vectors, where $d_1$ and $d_2$ are the matrix dimensions. The power of this new perturbation result is shown in robust covariance estimation, particularly when the random variables have heavy tails. There, we propose new robust covariance estimators and establish their asymptotic properties using the newly developed perturbation bound. Our theoretical results are verified through extensive numerical experiments.
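The phenomenon described in the abstract can be illustrated numerically. The sketch below (not the paper's code; all parameter values are illustrative assumptions) builds a rank-one incoherent symmetric matrix $A$, perturbs it with symmetric noise $E$, and compares the $\ell_2$ and $\ell_\infty$ errors of the leading eigenvector. Because the perturbation spreads roughly evenly across coordinates, the entrywise error is on the order of the $\ell_2$ error divided by $\sqrt{d}$, as the paper's bound predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 400  # matrix dimension (d1 = d2 = d in the symmetric case)

# Rank-1 incoherent symmetric matrix: a random unit eigenvector has
# entries of order 1/sqrt(d), which is the incoherent regime.
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
A = 5.0 * np.outer(v, v)

# Symmetric noise perturbation E with small entrywise scale.
E = rng.standard_normal((d, d)) * 0.01
E = (E + E.T) / 2.0
A_tilde = A + E

# Leading eigenvector of A + E; align its sign with v before comparing.
w = np.linalg.eigh(A_tilde)[1][:, -1]
w *= np.sign(w @ v)

diff = w - v
l2 = np.linalg.norm(diff)
linf = np.linalg.norm(diff, ord=np.inf)

# The entrywise (l_inf) error is far smaller than the l_2 error,
# roughly by a factor on the order of sqrt(d) (up to log factors).
print(f"l2 error:   {l2:.4f}")
print(f"linf error: {linf:.4f}")
print(f"l2/sqrt(d): {l2 / np.sqrt(d):.4f}")
```

Repeating the experiment with larger `d` makes the gap between the two norms grow accordingly, which is the practical content of the $\sqrt{d_1}$, $\sqrt{d_2}$ improvement stated in the abstract.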
@inproceedings{jianqingan,