The fastest known globally convergent first-order method for minimizing strongly convex functions

Bryan Van Scoy*, Randy A. Freeman, Kevin M. Lynch

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

70 Scopus citations

Abstract

We design and analyze a novel gradient-based algorithm for unconstrained convex optimization. When the objective function is m-strongly convex and its gradient is L-Lipschitz continuous, the iterates and function values converge linearly to the optimum at rates ρ and ρ², respectively, where ρ = 1 − √(m/L). These are the fastest known guaranteed linear convergence rates for globally convergent first-order methods, and for high desired accuracies the corresponding iteration complexity is within a factor of two of the theoretical lower bound. We use a simple graphical design procedure based on integral quadratic constraints to derive closed-form expressions for the algorithm parameters. The new algorithm, which we call the triple momentum method, can be seen as an extension of methods such as gradient descent, Nesterov's accelerated gradient descent, and the heavy-ball method.
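As a concrete illustration of the iteration described in the abstract, below is a minimal NumPy sketch of the triple momentum method, using the closed-form parameters reported in the published paper (α = (1+ρ)/L, β = ρ²/(2−ρ), γ = ρ²/((1+ρ)(2−ρ)), δ = ρ²/(1−ρ²), with ρ = 1 − √(m/L)). The function names, quadratic test problem, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def triple_momentum(grad_f, x0, m, L, num_iters=500):
    """Sketch of the triple momentum method for an m-strongly convex
    objective with L-Lipschitz gradient (parameters per Van Scoy,
    Freeman & Lynch, 2018)."""
    rho = 1.0 - np.sqrt(m / L)                    # linear rate rho = 1 - sqrt(m/L)
    alpha = (1.0 + rho) / L                       # step size
    beta = rho**2 / (2.0 - rho)                   # first momentum coefficient
    gamma = rho**2 / ((1.0 + rho) * (2.0 - rho))  # second momentum coefficient
    delta = rho**2 / (1.0 - rho**2)               # third (output) coefficient

    xi_prev = np.asarray(x0, dtype=float)  # xi_{k-1}
    xi = xi_prev.copy()                    # xi_k
    for _ in range(num_iters):
        y = (1.0 + gamma) * xi - gamma * xi_prev  # point where the gradient is queried
        xi_prev, xi = xi, (1.0 + beta) * xi - beta * xi_prev - alpha * grad_f(y)
    return (1.0 + delta) * xi - delta * xi_prev   # output sequence x_k

# Illustrative use on a quadratic whose curvature lies between m = 1 and
# L = 100; the unique minimizer is the origin.
A = np.diag(np.linspace(1.0, 100.0, 10))
x = triple_momentum(lambda z: A @ z, x0=np.ones(10), m=1.0, L=100.0)
print(np.linalg.norm(x))  # close to zero after enough iterations
```

On this test problem L/m = 100, so ρ = 0.9 and the distance to the minimizer shrinks by roughly a factor of 0.9 per iteration, matching the linear rate quoted in the abstract.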

Original language: English (US)
Pages (from-to): 49-54
Number of pages: 6
Journal: IEEE Control Systems Letters
Volume: 2
Issue number: 1
State: Published - Jan 2018

Keywords

  • Optimization algorithms
  • Robust control

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Control and Optimization
