Abstract
We design and analyze a novel gradient-based algorithm for unconstrained convex optimization. When the objective function is m-strongly convex and its gradient is L-Lipschitz continuous, the iterates and function values converge linearly to the optimum at rates ρ and ρ², respectively, where ρ = 1 − √(m/L). These are the fastest known guaranteed linear convergence rates for globally convergent first-order methods, and for high desired accuracies the corresponding iteration complexity is within a factor of two of the theoretical lower bound. We use a simple graphical design procedure based on integral quadratic constraints to derive closed-form expressions for the algorithm parameters. The new algorithm, which we call the triple momentum method, can be seen as an extension of methods such as gradient descent, Nesterov's accelerated gradient descent, and the heavy-ball method.
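As a concrete illustration of the method described above, the sketch below implements the three-sequence momentum recursion commonly written for the triple momentum method in Python. The closed-form parameter expressions (α, β, γ, δ) are reproduced here as best recalled from the paper and should be treated as an assumption rather than a verbatim transcription; the helper name `triple_momentum`, the iteration budget, and the quadratic test problem are likewise illustrative choices, not part of the abstract.

```python
import numpy as np

def triple_momentum(grad, x0, m, L, num_iters=500):
    """Minimal sketch of a triple-momentum-style iteration for an
    m-strongly convex objective with L-Lipschitz gradient.

    Parameter formulas follow the closed-form expressions reported
    for the method; they are reproduced for illustration only.
    """
    rho = 1.0 - np.sqrt(m / L)                     # guaranteed linear rate
    alpha = (1.0 + rho) / L                        # step size
    beta = rho**2 / (2.0 - rho)                    # momentum weight
    gamma = rho**2 / ((1.0 + rho) * (2.0 - rho))   # gradient-point weight
    delta = rho**2 / (1.0 - rho**2)                # output-sequence weight

    xi_prev = xi = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        y = (1.0 + gamma) * xi - gamma * xi_prev   # point where the gradient is evaluated
        xi_next = (1.0 + beta) * xi - beta * xi_prev - alpha * grad(y)
        xi_prev, xi = xi, xi_next
    return (1.0 + delta) * xi - delta * xi_prev    # output iterate x_k

# Illustrative use on a strongly convex quadratic with curvature in [m, L]:
m, L = 1.0, 100.0
A = np.diag(np.linspace(m, L, 10))
x = triple_momentum(lambda z: A @ z, x0=np.ones(10), m=m, L=L)
print(np.linalg.norm(x))   # near 0, the unique minimizer
```

Note how the recursion maintains three coupled sequences (ξ_k, y_k, x_k), which is what distinguishes it from the two-sequence structure of Nesterov's method and the heavy-ball method; with ρ = 1 − √(m/L), the distance to the optimum contracts by roughly ρ per iteration, matching the rate stated in the abstract.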
Original language | English (US) |
---|---|
Pages (from-to) | 49-54 |
Number of pages | 6 |
Journal | IEEE Control Systems Letters |
Volume | 2 |
Issue number | 1 |
DOIs | |
State | Published - Jan 2018 |
Keywords
- Optimization algorithms
- Robust control
ASJC Scopus subject areas
- Control and Systems Engineering
- Control and Optimization