An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy

Stefan Solntsev*, Jorge Nocedal, Richard H. Byrd

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We present an active-set method for minimizing an objective that is the sum of a convex quadratic and an ℓ1-regularization term. Unlike two-phase methods that combine a first-order active-set identification step and a subspace phase consisting of a cycle of conjugate gradient (CG) iterations, the method presented here has the flexibility of computing either a first-order proximal gradient step or a subspace CG step at each iteration. The decision of which type of step to perform is based on the relative magnitudes of some scaled components of the minimum norm subgradient of the objective function. The paper establishes global rates of convergence, as well as work complexity estimates, for two variants of our approach, which we call the interleaved ISTA-CG method, where ISTA denotes the iterative soft-thresholding algorithm. Numerical results illustrating the behavior of the method on a variety of test problems are presented.
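A rough sketch of the interleaved step selection described above is given below for the quadratic ℓ1 objective F(x) = ½xᵀAx − bᵀx + λ‖x‖₁. The function names, the fixed step size `alpha`, the small CG budget, and the plain comparison of subgradient norms on and off the support are illustrative assumptions, not the scaled subgradient test or the step-size rules analyzed in the paper.

```python
# Minimal sketch of an interleaved ISTA / subspace-CG loop for
#   F(x) = 0.5 x'Ax - b'x + lam * ||x||_1.
# The step-selection heuristic and parameters are simplified stand-ins.
import numpy as np

def min_norm_subgradient(A, b, lam, x):
    g = A @ x - b                          # gradient of the smooth quadratic part
    sub = g + lam * np.sign(x)             # subgradient on the support
    zero = (x == 0)
    # Minimum-norm element of g_i + lam*[-1, 1] for the zero components.
    sub[zero] = np.sign(g[zero]) * np.maximum(np.abs(g[zero]) - lam, 0.0)
    return sub

def ista_step(A, b, lam, x, alpha):
    y = x - alpha * (A @ x - b)            # forward (gradient) step
    return np.sign(y) * np.maximum(np.abs(y) - alpha * lam, 0.0)  # soft-threshold

def subspace_cg_step(A, b, lam, x, cg_iters=5):
    S = np.flatnonzero(x)                  # current support (nonzero variables)
    if S.size == 0:
        return x
    AS = A[np.ix_(S, S)]
    # Reduced gradient with the signs of the support held fixed.
    rhs = -((A @ x - b)[S] + lam * np.sign(x[S]))
    d, r, p = np.zeros_like(rhs), rhs.copy(), rhs.copy()
    for _ in range(cg_iters):              # a few CG iterations on the reduced system
        rr = r @ r
        if rr < 1e-16:
            break
        Ap = AS @ p
        a = rr / (p @ Ap)
        d = d + a * p
        r = r - a * Ap
        p = r + (r @ r / rr) * p
    x_new = x.copy()
    x_new[S] += d
    return x_new

def interleaved_ista_cg(A, b, lam, x0, alpha, iters=100, tol=1e-8):
    x = x0.copy()
    for _ in range(iters):
        sub = min_norm_subgradient(A, b, lam, x)
        if np.linalg.norm(sub) < tol:
            break
        # Crude stand-in for the paper's scaled test: take an ISTA step when
        # the zero variables dominate the subgradient, otherwise refine the
        # current support with CG.
        if np.linalg.norm(sub[x == 0]) > np.linalg.norm(sub[x != 0]):
            x = ista_step(A, b, lam, x, alpha)
        else:
            x = subspace_cg_step(A, b, lam, x)
    return x

# Example usage on a small random convex quadratic (illustration only).
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 50))
A = M.T @ M + 0.1 * np.eye(50)
b, lam = rng.standard_normal(50), 0.5
alpha = 1.0 / np.linalg.norm(A, 2)         # step size at most 1/L for the ISTA step
x = interleaved_ista_cg(A, b, lam, np.zeros(50), alpha)
```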

Original language: English (US)
Pages (from-to): 1213-1237
Number of pages: 25
Journal: Optimization Methods and Software
Volume: 30
Issue number: 6
State: Published - Nov 2 2015

Keywords

  • active-set method
  • convex optimization
  • lasso
  • nonlinear optimization

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics

