Motivated by big data applications, first-order methods have become extremely popular in recent years. However, naive gradient methods generally converge slowly, so much effort has been made to accelerate various first-order methods. This paper proposes two accelerated methods for solving structured linearly constrained convex programming, where the objective is composite convex, i.e., the sum of a differentiable function and a possibly nondifferentiable one.
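Concretely, the problem class can be written (in one illustrative notation adopted only for this summary) as
\[
  \min_{x}\; f(x) + g(x) \quad \text{s.t.} \quad Ax = b,
\]
where f is convex and differentiable, g is convex and possibly nondifferentiable, and the matrix A and vector b specify the linear constraints.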
The first method is the accelerated linearized augmented Lagrangian method (LALM). At each update to the primal variable, it allows linearization of the differentiable function and also of the augmented term, which leads to easy subproblems. Assuming mere convexity, we show that LALM achieves O(1/t) convergence if the parameters are fixed throughout the iterations, and the rate can be accelerated to O(1/t^2) if the parameters are adapted, where t is the total number of iterations.
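As an illustration (the step size η, penalty parameter β, and multiplier λ^k below are shown only schematically, and the adaptive parameter choices are omitted), a primal update that linearizes both the differentiable function and the augmented term takes the form
\[
  x^{k+1} = \operatorname*{arg\,min}_{x}\;
  \big\langle \nabla f(x^{k}) + A^{\top}\lambda^{k} + \beta A^{\top}(Ax^{k}-b),\, x \big\rangle
  + g(x) + \tfrac{1}{2\eta}\,\|x - x^{k}\|^{2},
\]
so that each iteration only requires the proximal mapping of g.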
The second method is the accelerated linearized alternating direction method of multipliers (LADMM). In addition to composite convexity, it further assumes a two-block structure of the objective. Different from the classic ADMM, our method allows linearization of the objective and also of the augmented term to keep the updates simple. Assuming strong convexity of one block variable, we show that LADMM also enjoys O(1/t^2) convergence with adaptive parameters. This result is a significant improvement over that in [Goldstein et al., SIIMS'14], which requires strong convexity of both block variables and allows no linearization of the objective or augmented term.
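For reference, the two-block setting (again in illustrative notation) reads
\[
  \min_{x,\,y}\; f(x) + g(y) \quad \text{s.t.} \quad Ax + By = b,
\]
where strong convexity is assumed on only one of the two block variables.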
Numerical experiments are performed on quadratic programming, image denoising, and support vector machine problems. The proposed accelerated methods are compared to nonaccelerated ones as well as existing accelerated methods. The results demonstrate the effectiveness of the acceleration and the superior performance of the proposed methods over existing ones.