A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization

N. Keskar, Andreas Waechter*

*Corresponding author for this work

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

We consider the problem of minimizing a continuous function that may be non-smooth and non-convex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem's curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that defines the subspace in which search directions are computed. To overcome the inherent shortsightedness of the gradient for a non-smooth function, we propose two strategies. The first relies on an approximation of the ε-minimum norm subgradient, and the second uses an iterative corrective loop that augments the active set based on the resulting search directions. While theoretical convergence guarantees have been elusive even for the unconstrained case, we present numerical results on a set of standard test problems to illustrate the efficacy of our approach, using an open-source Python implementation of the proposed algorithm.
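
To make the abstract's ingredients concrete, the sketch below is an illustrative (and deliberately simplified) Python rendering of the generic building blocks the abstract names: projection onto the bound constraints, an active-set guess that freezes variables pressed against their bounds, and a bisection line search enforcing the weak Wolfe conditions. It is not the authors' open-source implementation; every function name and the toy objective are hypothetical, and the free-subspace direction here is a plain (sub)gradient step rather than the paper's L-BFGS direction.

import numpy as np

def project(x, lb, ub):
    """Project a point onto the box [lb, ub]."""
    return np.minimum(np.maximum(x, lb), ub)

def guess_active_set(x, g, lb, ub, tol=1e-8):
    """Flag variables sitting at a bound whose (sub)gradient pushes them
    further outside the box; these are held fixed when forming a direction."""
    at_lower = (x - lb <= tol) & (g > 0.0)
    at_upper = (ub - x <= tol) & (g < 0.0)
    return at_lower | at_upper

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe conditions
    (sufficient decrease plus a directional-derivative condition), in the
    spirit of line searches used with non-smooth quasi-Newton methods."""
    f0, g0 = f(x), grad(x).dot(d)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:        # too little decrease: shrink the step
            hi = t
        elif grad(x + t * d).dot(d) < c2 * g0:     # directional derivative still too negative: grow
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return t                                       # give up and return the last trial step

# Toy usage: minimize the non-smooth f(x) = |x1| + (x2 - 1)^2 over the box [-2, 2]^2,
# taking (sub)gradient steps restricted to the estimated free subspace.
f = lambda x: abs(x[0]) + (x[1] - 1.0) ** 2
grad = lambda x: np.array([np.sign(x[0]), 2.0 * (x[1] - 1.0)])
lb, ub = np.full(2, -2.0), np.full(2, 2.0)

x = np.array([1.5, -1.5])
for _ in range(50):
    g = grad(x)
    d = -g
    d[guess_active_set(x, g, lb, ub)] = 0.0        # search only over the free variables
    if np.linalg.norm(d) < 1e-10:
        break
    t = weak_wolfe_line_search(f, grad, x, d)
    x = project(x + t * d, lb, ub)                 # keep the iterate feasible
print(x)  # for this toy run the iterates reach the box minimizer (0, 1)

In the paper itself, the free-subspace direction comes from an L-BFGS model of the curvature, and the naive active-set guess above is refined either through the ε-minimum norm subgradient approximation or through the iterative corrective loop mentioned in the abstract.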

Original language: English (US)
Pages (from-to): 150-171
Number of pages: 22
Journal: Optimization Methods and Software
Volume: 34
Issue number: 1
DOI: 10.1080/10556788.2017.1378652
State: Published - Jan 2 2019

Keywords

  • L-BFGS
  • Non-smooth optimization
  • active-set correction
  • active-set method
  • bound constraints
  • quasi-Newton

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics

Cite this

@article{1623abea0298477d97a71b27148ddf69,
  title = "A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization",
  abstract = "We consider the problem of minimizing a continuous function that may be non-smooth and non-convex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem's curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that defines the subspace in which search directions are computed. To overcome the inherent shortsightedness of the gradient for a non-smooth function, we propose two strategies. The first relies on an approximation of the ε-minimum norm subgradient, and the second uses an iterative corrective loop that augments the active set based on the resulting search directions. While theoretical convergence guarantees have been elusive even for the unconstrained case, we present numerical results on a set of standard test problems to illustrate the efficacy of our approach, using an open-source Python implementation of the proposed algorithm.",
  keywords = "L-BFGS, Non-smooth optimization, active-set correction, active-set method, bound constraints, quasi-Newton",
  author = "N. Keskar and Andreas Waechter",
  year = "2019",
  month = "1",
  day = "2",
  doi = "10.1080/10556788.2017.1378652",
  language = "English (US)",
  volume = "34",
  pages = "150--171",
  journal = "Optimization Methods and Software",
  issn = "1055-6788",
  publisher = "Taylor and Francis Ltd.",
  number = "1",
}

A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization. / Keskar, N.; Waechter, Andreas.

In: Optimization Methods and Software, Vol. 34, No. 1, 02.01.2019, p. 150-171.

Research output: Contribution to journal › Article

TY - JOUR

T1 - A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization

AU - Keskar, N.

AU - Waechter, Andreas

PY - 2019/1/2

Y1 - 2019/1/2

N2 - We consider the problem of minimizing a continuous function that may be non-smooth and non-convex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem's curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that defines the subspace in which search directions are computed. To overcome the inherent shortsightedness of the gradient for a non-smooth function, we propose two strategies. The first relies on an approximation of the ε-minimum norm subgradient, and the second uses an iterative corrective loop that augments the active set based on the resulting search directions. While theoretical convergence guarantees have been elusive even for the unconstrained case, we present numerical results on a set of standard test problems to illustrate the efficacy of our approach, using an open-source Python implementation of the proposed algorithm.

AB - We consider the problem of minimizing a continuous function that may be non-smooth and non-convex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem's curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that defines the subspace in which search directions are computed. To overcome the inherent shortsightedness of the gradient for a non-smooth function, we propose two strategies. The first relies on an approximation of the ε-minimum norm subgradient, and the second uses an iterative corrective loop that augments the active set based on the resulting search directions. While theoretical convergence guarantees have been elusive even for the unconstrained case, we present numerical results on a set of standard test problems to illustrate the efficacy of our approach, using an open-source Python implementation of the proposed algorithm.

KW - L-BFGS

KW - Non-smooth optimization

KW - active-set correction

KW - active-set method

KW - bound constraints

KW - quasi-Newton

UR - http://www.scopus.com/inward/record.url?scp=85030556641&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85030556641&partnerID=8YFLogxK

U2 - 10.1080/10556788.2017.1378652

DO - 10.1080/10556788.2017.1378652

M3 - Article

AN - SCOPUS:85030556641

VL - 34

SP - 150

EP - 171

JO - Optimization Methods and Software

JF - Optimization Methods and Software

SN - 1055-6788

IS - 1

ER -