# MathSciDoc: An Archive for Mathematicians

#### Numerical Analysis and Scientific Computing · Optimization and Control · mathscidoc:1703.25021

SIAM Journal on Optimization, 2017
Motivated by big-data applications, first-order methods have become extremely popular in recent years. However, naive gradient methods generally converge slowly, so much effort has been devoted to accelerating various first-order methods. This paper proposes two accelerated methods for solving structured linearly constrained convex programs, where the objective is composite convex: the sum of a differentiable function and a possibly nondifferentiable one. The first method is the accelerated linearized augmented Lagrangian method (LALM). At each update of the primal variable, it allows linearization of both the differentiable function and the augmented term, which makes the subproblems easy. Assuming mere convexity, we show that LALM achieves O(1/t) convergence if the parameters are kept fixed over all iterations and can be accelerated to O(1/t^2) if the parameters are adapted, where t is the total number of iterations. The second method is the accelerated linearized alternating direction method of multipliers (LADMM). In addition to composite convexity, it further assumes a two-block structure on the objective. Unlike classic ADMM, our method allows linearization of the objective and the augmented term to keep the updates simple. Assuming strong convexity of one block variable, we show that LADMM also enjoys O(1/t^2) convergence with adaptive parameters. This result significantly improves on that of [Goldstein et al., SIIMS'14], which requires strong convexity of both block variables and allows no linearization of the objective or the augmented term. Numerical experiments are performed on quadratic programming, image denoising, and support vector machines. The proposed accelerated methods are compared to nonaccelerated ones and to existing accelerated methods. The results demonstrate the effectiveness of the acceleration and the superior performance of the proposed methods over existing ones.
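To make the linearized-update idea concrete, here is a minimal sketch of a (non-accelerated, fixed-parameter) linearized ALM iteration for the toy problem min ½‖x − c‖² + μ‖x‖₁ subject to Ax = b. The problem instance, step-size rule, and all names (`linearized_alm`, `soft_threshold`, `eta`, `rho`) are illustrative assumptions, not the paper's exact algorithm; with fixed parameters this corresponds to the O(1/t) regime described above, and adapting the parameters is what yields the O(1/t^2) rate.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_alm(A, b, c, mu=0.1, rho=1.0, eta=None, iters=1000):
    """Linearized ALM sketch for: min 0.5||x - c||^2 + mu*||x||_1  s.t. Ax = b.

    Each primal step linearizes both the smooth term f(x) = 0.5||x - c||^2
    and the augmented term 0.5*rho*||Ax - b||^2, so the subproblem reduces
    to a single soft-thresholding (proximal gradient) step.
    """
    m, n = A.shape
    x = np.zeros(n)
    lam = np.zeros(m)       # multiplier for Ax = b
    if eta is None:
        # Step size from the Lipschitz constant of the linearized smooth part:
        # grad f has constant 1, and the augmented term adds rho * ||A||_2^2.
        eta = 1.0 / (1.0 + rho * np.linalg.norm(A, 2) ** 2)
    for _ in range(iters):
        # Gradient of f plus the augmented Lagrangian term, both linearized at x.
        grad = (x - c) + A.T @ (lam + rho * (A @ x - b))
        x = soft_threshold(x - eta * grad, eta * mu)
        lam = lam + rho * (A @ x - b)   # dual ascent step
    return x, lam
```

Because the nonsmooth term enters only through its prox, each iteration costs one matrix-vector multiply with A and one with A^T, which is the "easy subproblem" property the abstract refers to.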
acceleration, linearization, first-order method, augmented Lagrangian method, alternating direction method of multipliers
• To appear
```
@article{yangyang2017accelerated,
title={Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming},
author={Yangyang Xu},
url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20170314093357130655690},
journal={SIAM Journal on Optimization},
year={2017},
}
```
Yangyang Xu. Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming. SIAM Journal on Optimization, 2017. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20170314093357130655690.