TY - JOUR
T1 - First-Order Methods for Convex Optimization
AU - Dvurechensky, P.
AU - Shtern, S.
AU - Staudigl, M.
PY - 2021
N2 - First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization over the last 20 years. The rapid development of this important class of algorithms is motivated by the success stories reported in various applications, including, most importantly, machine learning, signal processing, imaging, and control theory. First-order methods have the potential to provide low-accuracy solutions at low computational complexity, which makes them an attractive set of tools for large-scale optimization problems. In this survey, we cover a number of key developments in gradient-based optimization methods. This includes non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally, we survey recent developments within the class of projection-free methods and proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several optimization algorithms.
KW - Convex Optimization
KW - Composite Optimization
KW - First-Order Methods
KW - Numerical Algorithms
KW - Convergence Rate
KW - Proximal Mapping
KW - Proximity Operator
KW - Bregman Divergence
KW - Stochastic Composite Optimization
KW - Projected Subgradient Methods
KW - Intermediate Gradient Method
KW - Coordinate Descent Methods
KW - Variational Inequalities
KW - Mirror Descent
KW - Frank-Wolfe
KW - Approximation Algorithms
KW - Thresholding Algorithm
KW - Minimization Algorithm
DO - 10.1016/j.ejco.2021.100015
M3 - Article
SN - 2192-4406
VL - 9
JO - EURO Journal on Computational Optimization
JF - EURO Journal on Computational Optimization
M1 - 100015
ER -