Models and algorithms for distributionally robust least squares problems

Sanjay Mehrotra*, He Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



We present three different robust frameworks using probabilistic ambiguity descriptions of the data in least squares problems. These ambiguity descriptions are given by: (1) a confidence region over the first two moments; (2) bounds on the probability measure with moment constraints; (3) the Kantorovich probability distance from a given measure. For the first case, we give an equivalent formulation and show that the optimization problem can be solved using a semidefinite optimization reformulation or polynomial-time algorithms. For the second case, we derive the equivalent Lagrangian problem and show that it is a convex stochastic programming problem. We further analyze three special subcases: (i) finite support; (ii) measure bounds by a reference probability measure; (iii) measure bounds by two reference probability measures with known density functions. We show that subcase (i) has an equivalent semidefinite programming reformulation, and that the sample average approximations of subcases (ii) and (iii) have equivalent semidefinite programming reformulations. For ambiguity description (3), we show that the finite-support case can be solved using an equivalent second-order cone programming reformulation.
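As a toy illustration of ambiguity description (2) in the finite-support subcase (i): for a fixed candidate solution x, the adversary's worst-case expected squared residual over probability vectors p with elementwise bounds l ≤ p ≤ u is a linear program in p. The sketch below is not from the paper; it is a minimal example assuming SciPy, with made-up scenario data and bounds.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 5, 2                      # 5 data scenarios, 2 features (illustrative sizes)
A = rng.normal(size=(m, n))      # scenario i contributes the residual a_i'x - b_i
b = rng.normal(size=m)
x = np.zeros(n)                  # candidate solution being evaluated

# Squared residual under each scenario; the adversary maximizes
# E_p[(a_i'x - b_i)^2] over probability vectors p with l <= p <= u.
r2 = (A @ x - b) ** 2

l = np.full(m, 0.05)             # hypothetical lower bounds on scenario probabilities
u = np.full(m, 0.5)              # hypothetical upper bounds

# linprog minimizes, so negate the objective to maximize; sum(p) = 1.
res = linprog(-r2, A_eq=np.ones((1, m)), b_eq=[1.0], bounds=list(zip(l, u)))
worst_case = -res.fun
print(worst_case)
```

The outer minimization over x would wrap this inner linear program; the paper shows the combined problem admits an equivalent semidefinite programming reformulation rather than requiring such nested solves.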

Original language: English (US)
Pages (from-to): 123-141
Number of pages: 19
Journal: Mathematical Programming
Issue number: 1-2
State: Published - Aug 2014


Mathematics Subject Classification

  • 62G35 Robustness
  • 62J05 Linear regression
  • 62J07 Ridge regression; shrinkage estimators
  • 90C15 Stochastic programming
  • 90C22 Semidefinite programming
  • 93B35 Sensitivity (robustness)
  • 93E24 Least squares and related methods

ASJC Scopus subject areas

  • Software
  • General Mathematics


