## Abstract

Finding efficient and provable methods to solve non-convex optimization problems is an outstanding challenge in machine learning and optimization theory. A popular approach used to tackle non-convex problems is to use convex relaxation techniques to find a convex surrogate for the problem. Unfortunately, convex relaxations typically must be found on a problem-by-problem basis. Thus, providing a general-purpose strategy to estimate a convex relaxation would have a wide-reaching impact. Here, we introduce Convex Relaxation Regression (CoRR), an approach for learning convex relaxations for a class of smooth functions. The idea behind our approach is to estimate the convex envelope of a function f by evaluating f at a set of T random points and then fitting a convex function to these function evaluations. We prove that with probability greater than 1 - δ, the solution of our algorithm converges to the global optimizer of f with error O((log(1/δ)/T)^{α}) for some α > 0. Our approach enables the use of convex optimization tools to solve non-convex optimization problems.
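The core idea in the abstract — sample a non-convex function at T random points and fit a convex surrogate to the evaluations — can be illustrated with a toy one-dimensional sketch. This is not the paper's CoRR algorithm; the test function, the quadratic surrogate family, and all names below are illustrative assumptions.

```python
import numpy as np

# Illustrative non-convex test function (not from the paper): a
# quadratic bowl with an oscillatory perturbation.
def f(x):
    return x**2 + 0.5 * np.sin(5 * x)

rng = np.random.default_rng(0)
T = 200
x = rng.uniform(-2.0, 2.0, T)   # T random evaluation points
y = f(x)                        # function evaluations of f

# Fit a surrogate g(x) = a x^2 + b x + c by least squares, then clip
# a to be non-negative so that g is convex. (The paper fits over a
# richer class of convex functions; a quadratic keeps the sketch short.)
A = np.stack([x**2, x, np.ones(T)], axis=1)
a, b, c = np.linalg.lstsq(A, y, rcond=None)[0]
a = max(a, 0.0)

# Minimize the convex surrogate in closed form.
x_star = -b / (2 * a) if a > 0 else 0.0
print(x_star)
```

Because convex fitting averages out the non-convex wiggles, the surrogate's minimizer lands near the basin of f's global minimum, where a local convex method could then finish the job.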

Field | Value
---|---
Original language | English (US)
Title of host publication | 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016
Publisher | Association for Uncertainty in Artificial Intelligence (AUAI)
Pages | 22-31
Number of pages | 10
ISBN (Electronic) | 9781510827806
State | Published - Jan 1 2016
Event | 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016 - Jersey City, United States. Duration: Jun 25 2016 → Jun 29 2016

### Other

Field | Value
---|---
Other | 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016
Country | United States
City | Jersey City
Period | 6/25/16 → 6/29/16

## ASJC Scopus subject areas

- Artificial Intelligence