Provably training overparameterized neural network classifiers with non-convex constraints

You Lin Chen, Zhaoran Wang, Mladen Kolar

Research output: Contribution to journal › Article › peer-review


Training a classifier under non-convex constraints has attracted increasing attention in the machine learning community thanks to its wide range of applications, such as algorithmic fairness and class-imbalanced classification. However, several recent works addressing non-convex constraints have focused only on simple models such as logistic regression or support vector machines. Neural networks, among the most popular models for classification today, are excluded from these analyses and lack theoretical guarantees. In this work, we show that overparameterized neural networks can achieve a near-optimal and near-feasible solution of non-convex constrained optimization problems via projected stochastic gradient descent. Our key ingredient is a no-regret analysis of online learning for neural networks in the overparameterization regime, which may be of independent interest in online learning applications.
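The abstract centers on projected stochastic gradient descent for constrained problems. As a minimal illustration of that primitive (not the paper's actual algorithm, network parameterization, or constraint set), the sketch below runs projected SGD on a logistic loss with an L2-ball constraint standing in for the feasible region; all names and the toy data are hypothetical.

```python
# Minimal sketch of projected SGD on a constrained classification problem.
# The L2-ball constraint and logistic model are illustrative stand-ins, not
# the non-convex constraints or overparameterized networks from the paper.
import numpy as np

rng = np.random.default_rng(0)

def project_l2_ball(w, radius=1.0):
    """Euclidean projection onto the L2 ball of the given radius."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def projected_sgd(X, y, radius=1.0, lr=0.1, epochs=50):
    """Projected SGD: take a stochastic gradient step, then project back
    onto the feasible set after every update."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            grad = -y[i] * X[i] / (1.0 + np.exp(margin))  # logistic loss gradient
            w = project_l2_ball(w - lr * grad, radius)
    return w

# Toy linearly separable data with labels in {-1, +1}.
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true)

w_hat = projected_sgd(X, y)
acc = np.mean(np.sign(X @ w_hat) == y)
```

Every iterate stays feasible by construction, since the projection is applied after each gradient step; the paper's contribution is showing that an analogous scheme retains near-optimality and near-feasibility guarantees when the model is an overparameterized neural network and the constraints are non-convex.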

Original language: English (US)
Pages (from-to): 5812-5851
Number of pages: 40
Journal: Electronic Journal of Statistics
Issue number: 2
State: Published - 2022


Keywords

  • Neural tangent kernel
  • non-convex constrained optimization
  • online learning with non-convex losses

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


