Abstract
Motivated by modern regression applications, in this paper we study the convexification of a class of convex optimization problems with indicator variables and combinatorial constraints on the indicators. Unlike most previous work on the convexification of sparse regression problems, we simultaneously consider the nonlinear non-separable objective, indicator variables, and combinatorial constraints. Specifically, we give the convex hull description of the epigraph of the composition of a one-dimensional convex function and an affine function under arbitrary combinatorial constraints. As special cases of this result, we derive ideal convexifications for problems with hierarchy, multi-collinearity, and sparsity constraints. Moreover, we give a short proof that, for a separable objective function, the perspective reformulation is ideal independently of the constraints of the problem. Our computational experiments with sparse regression problems demonstrate the potential of the proposed approach to improve relaxation quality without significant computational overhead.
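To illustrate the perspective reformulation mentioned in the abstract, the following is the standard textbook example (a well-known result from the literature, not an excerpt from this paper): for a quadratic term with an indicator variable that forces the continuous variable to zero, the closure of the convex hull is obtained by replacing the quadratic constraint with its perspective counterpart.

```latex
% Mixed-integer epigraph set: t >= x^2, with x = 0 enforced whenever z = 0.
\[
  S = \bigl\{ (x, z, t) \in \mathbb{R} \times \{0,1\} \times \mathbb{R}_+ :
      t \ge x^2,\; x(1 - z) = 0 \bigr\}
\]
% Perspective reformulation: the closure of conv(S) is described by
% replacing t >= x^2 with the perspective inequality t z >= x^2.
\[
  \overline{\operatorname{conv}}(S) = \bigl\{ (x, z, t) \in
      \mathbb{R} \times [0,1] \times \mathbb{R}_+ : t z \ge x^2 \bigr\}
\]
```

Note that the perspective inequality \(t z \ge x^2\) with \(z = 0\) forces \(x = 0\), recovering the indicator logic in the continuous relaxation; the paper's contribution extends such ideal descriptions to non-separable objectives under combinatorial constraints.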
| Original language | English (US) |
|---|---|
| Pages (from-to) | 57-88 |
| Number of pages | 32 |
| Journal | Mathematical Programming |
| Volume | 192 |
| Issue number | 1-2 |
| DOIs | |
| State | Published - Mar 2022 |
Funding
We thank the Associate Editor and two referees, whose comments expanded and improved our computational study and also led to the result in Appendix 1. This research is supported, in part, by ONR grant N00014-19-1-2321 and NSF grants 1818700, 2006762, and 2007814.
Keywords
- Combinatorial constraints
- Convexification
- Indicator variables
- Perspective formulation
ASJC Scopus subject areas
- Software
- General Mathematics