On Global Linear Convergence in Stochastic Nonconvex Optimization for Semidefinite Programming

Jinshan Zeng (Jiangxi Normal University), Ke Ma (Institute of Information Engineering, Chinese Academy of Sciences), Yuan Yao (Hong Kong University of Science and Technology)

Machine Learning mathscidoc:2004.41002

Distinguished Paper Award in 2020

IEEE TRANSACTIONS ON SIGNAL PROCESSING, 67(16), 4261-4275, 2019.8
Nonconvex reformulations via low-rank factorization of stochastic convex semidefinite optimization problems have attracted increasing attention owing to their empirical efficiency and scalability. Compared with the original convex formulations, the nonconvex ones typically involve far fewer variables, allowing them to scale to scenarios with millions of variables. However, this raises a new question: despite their empirical success in applications, under what conditions can the nonconvex stochastic algorithms find the population minimizer within the optimal statistical precision? In this paper, we provide an answer: the stochastic gradient descent (SGD) method can be adapted to solve the nonconvex reformulation of the original convex problem with global linear convergence when a fixed step size is used, i.e., it converges exponentially fast to the population minimizer within an optimal statistical precision in the restricted strongly convex case. If a diminishing step size is adopted instead, the adverse effect of the gradient variance on the optimization error can be eliminated, but the convergence rate drops to sublinear. The core of our analysis is a novel second-order descent lemma, which is more general than the best existing result in the literature and improves the analysis of both online and batch algorithms. The theoretical results and the effectiveness of the proposed SGD method are also verified by a series of experiments.
Stochastic gradient descent, semidefinite optimization, low-rank factorization
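To make the described approach concrete, the following is a minimal illustrative sketch (not the authors' exact algorithm or experimental setup) of fixed-step SGD applied to a low-rank factorization X = U U^T of a toy stochastic matrix-sensing semidefinite program; the problem instance, dimensions, step size, and iteration count are all assumed here for illustration only.

import numpy as np

# Minimal sketch (assumed toy instance, not the paper's algorithm or experiments):
# fixed-step SGD on the low-rank factorization X = U U^T of the stochastic
# matrix-sensing semidefinite program
#     min_{X PSD}  E_{(A, y)} [ (<A, X> - y)^2 ],
# whose population minimizer is the ground-truth PSD matrix X_star.

rng = np.random.default_rng(0)
n, r = 10, 2                                  # matrix size and factorization rank
U_star = rng.standard_normal((n, r))
X_star = U_star @ U_star.T                    # ground-truth PSD matrix

def sample_measurement():
    """Draw one stochastic measurement (A, y) with y = <A, X_star> + noise."""
    A = rng.standard_normal((n, n))
    y = np.sum(A * X_star) + 0.01 * rng.standard_normal()
    return A, y

U = rng.standard_normal((n, r))               # random initialization of the factor variable
step = 5e-5                                   # fixed step size; a diminishing schedule such as
                                              # step/(1 + t) would remove the variance-induced
                                              # error floor at the cost of a sublinear rate

for t in range(50000):
    A, y = sample_measurement()
    residual = np.sum(A * (U @ U.T)) - y
    # Gradient of (<A, U U^T> - y)^2 with respect to U is 2*residual*(A + A^T) U.
    grad = 2.0 * residual * (A + A.T) @ U
    U -= step * grad

print("relative error:", np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star))

With a fixed step size the iterates contract quickly toward the population minimizer and then fluctuate around it at a variance-dependent error floor, which is the qualitative behavior described in the abstract.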
@article{jinshan2019on,
  title={On Global Linear Convergence in Stochastic Nonconvex Optimization for Semidefinite Programming},
  author={Zeng, Jinshan and Ma, Ke and Yao, Yuan},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20200427095332329638657},
  journal={IEEE Transactions on Signal Processing},
  volume={67},
  number={16},
  pages={4261--4275},
  year={2019},
}
Jinshan Zeng, Ke Ma, and Yuan Yao. On Global Linear Convergence in Stochastic Nonconvex Optimization for Semidefinite Programming. IEEE Transactions on Signal Processing, 67(16): 4261-4275, 2019. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20200427095332329638657.