Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

Yu Wang (University of California, Berkeley), Wotao Yin (University of California, Los Angeles), Jinshan Zeng (Jiangxi Normal University)

Optimization and Control · mathscidoc:1903.27001

Journal of Scientific Computing, vol. 78, 2019
In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, …, x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, …, x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it plays a special role in our analysis. The developed convergence guarantee covers a variety of nonconvex functions, such as piecewise linear functions, the ℓ_q quasi-norm and the Schatten-q quasi-norm (0 < q < 1), the minimax concave penalty (MCP), and the smoothly clipped absolute deviation (SCAD) penalty. It also allows nonconvex constraints, such as compact manifolds (e.g., spherical, Stiefel, and Grassmann manifolds) and linear complementarity constraints. Moreover, the x_0-block can be almost any lower semi-continuous function. By applying our analysis, we show, for the first time, that several ADMM algorithms applied to solve nonconvex models in statistical learning, optimization on manifolds, and matrix decomposition are guaranteed to converge. Our results provide sufficient conditions for ADMM to converge on (convex or nonconvex) monotropic programs with three or more blocks, as these are special cases of our model. ADMM has been regarded as a variant of the augmented Lagrangian method (ALM). We present a simple example to illustrate how ADMM converges while ALM diverges with a bounded penalty parameter β. As indicated by this example and other analysis in this paper, ADMM may be a better choice than ALM for some nonconvex nonsmooth problems, because ADMM is not only easier to implement but also more likely to converge in the concerned scenarios.
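For concreteness, the multi-block scheme described in the abstract can be written out as follows. This is a sketch in standard ADMM notation, assuming coupled linear constraints A_0 x_0 + ⋯ + A_p x_p + B y = b; the matrices A_i, B, the vector b, and the dual variable w are notational assumptions for illustration, not quoted from the paper.

% Augmented Lagrangian for min φ(x_0,…,x_p,y) s.t. A_0 x_0 + … + A_p x_p + B y = b
% (A_i, B, b, and w are illustrative notation, not taken verbatim from the paper)
\[
L_\beta(x_0,\dots,x_p,y,w) = \phi(x_0,\dots,x_p,y)
  + \Big\langle w,\ \sum_{i=0}^{p} A_i x_i + By - b \Big\rangle
  + \frac{\beta}{2}\Big\|\sum_{i=0}^{p} A_i x_i + By - b\Big\|^2 .
\]
% One ADMM iteration: Gauss–Seidel sweep over the primal blocks, then a dual ascent step
\begin{align*}
x_i^{k+1} &\in \operatorname*{arg\,min}_{x_i}\; L_\beta\big(x_0^{k+1},\dots,x_{i-1}^{k+1},\, x_i,\, x_{i+1}^{k},\dots,x_p^{k},\, y^{k},\, w^{k}\big), \quad i = 0,\dots,p,\\
y^{k+1} &\in \operatorname*{arg\,min}_{y}\; L_\beta\big(x_0^{k+1},\dots,x_p^{k+1},\, y,\, w^{k}\big),\\
w^{k+1} &= w^{k} + \beta\Big(\sum_{i=0}^{p} A_i x_i^{k+1} + By^{k+1} - b\Big).
\end{align*}

Updating y last, after all the x_i-blocks and just before the dual step, is consistent with the special role of y noted in the abstract.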
ADMM · Nonconvex optimization · Augmented Lagrangian method · Block coordinate descent · Sparse optimization
@article{wang2019global,
  title={Global Convergence of ADMM in Nonconvex Nonsmooth Optimization},
  author={Wang, Yu and Yin, Wotao and Zeng, Jinshan},
  journal={Journal of Scientific Computing},
  volume={78},
  year={2019},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20190319201308904192204},
}
Yu Wang, Wotao Yin, and Jinshan Zeng. Global Convergence of ADMM in Nonconvex Nonsmooth Optimization. Journal of Scientific Computing, vol. 78, 2019. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20190319201308904192204.