A family of second-order methods for convex ℓ1-regularized optimization

Richard H. Byrd, Gillian M. Chin*, Jorge Nocedal, Figen Oztoprak

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

22 Scopus citations


This paper is concerned with the minimization of an objective that is the sum of a convex function f and an ℓ1 regularization term. Our interest is in active-set methods that incorporate second-order information about the function f to accelerate convergence. We describe a semismooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. The paper proposes a new active-set method that performs multiple changes in the active manifold estimate at every iteration and employs a mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of three active-set methods are presented.
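To fix ideas, the classical first-order iterative soft-thresholding (ISTA) baseline that the paper's second-order variants build on can be sketched as follows. This is a minimal illustration for a least-squares f, not the authors' algorithm; the function names, the random test problem, and the choice of λ and step size are assumptions made here for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1: componentwise shrinkage toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, step, iters=500):
    """First-order iterative soft-thresholding for
    min_x 0.5*||A x - b||^2 + lam*||x||_1 (a simple convex f)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part f
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Tiny sparse-recovery example (synthetic data, chosen for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L, L = Lipschitz constant of grad f
x_hat = ista(A, b, lam=0.1, step=step)
```

A second-order method in the paper's framework replaces the plain gradient step with a step that uses (an approximation of) the Hessian of f on the estimated active manifold, which is what accelerates convergence near the solution.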

Original language: English (US)
Pages (from-to): 435-467
Number of pages: 33
Journal: Mathematical Programming
Issue number: 1-2
State: Published - Sep 1 2016


Keywords (MSC codes)
  • 49M37
  • 65K05
  • 90C30

ASJC Scopus subject areas

  • Software
  • Mathematics (all)
