Abstract
Training a classifier under non-convex constraints has received increasing attention in the machine learning community thanks to its wide range of applications, such as algorithmic fairness and class-imbalanced classification. However, several recent works addressing non-convex constraints have focused only on simple models such as logistic regression or support vector machines. Neural networks, one of the most popular classification models today, have so far been precluded and lack theoretical guarantees. In this work, we show that overparameterized neural networks can achieve a near-optimal and near-feasible solution of a non-convex constrained optimization problem via projected stochastic gradient descent. Our key ingredient is a no-regret analysis of online learning for neural networks in the overparameterization regime, which may be of independent interest in online learning applications.
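The abstract describes projected stochastic gradient descent on a constrained objective; below is a minimal, self-contained sketch of one common instantiation of that idea, not the paper's exact algorithm. It runs a primal-dual update for a toy constrained classification problem with an overparameterized two-layer network, projecting the primal iterate onto a ball around its initialization as NTK-style analyses often do. All specifics here (the width `m`, the fairness-style constraint, the radii `R` and `B`, and the step sizes) are illustrative assumptions.

```python
import torch

torch.manual_seed(0)

# Toy data: binary labels plus a "group" attribute for a fairness-style constraint.
n, d, m = 512, 10, 2048                      # large width m: overparameterized
X = torch.randn(n, d)
y = (X[:, 0] > 0).float()
group = torch.rand(n) < 0.5                  # hypothetical protected-group indicator

# Two-layer ReLU network with NTK-style 1/sqrt(m) scaling.
W1 = torch.randn(d, m, requires_grad=True)
w2 = torch.randn(m, requires_grad=True)
W1_init = W1.detach().clone()

def f(x):
    return torch.relu(x @ W1) @ w2 / m ** 0.5

def constraint(logits):
    # Illustrative non-convex constraint: mean-score gap between groups <= 0.05.
    p = torch.sigmoid(logits)
    return (p[group].mean() - p[~group].mean()).abs() - 0.05

bce = torch.nn.functional.binary_cross_entropy_with_logits
lam, B, R, eta, eta_dual = 0.0, 10.0, 10.0, 0.5, 0.1

for t in range(500):
    idx = torch.randint(0, n, (64,))         # stochastic mini-batch for the loss
    logits = f(X)
    lagrangian = bce(logits[idx], y[idx]) + lam * constraint(logits)
    g1, g2 = torch.autograd.grad(lagrangian, (W1, w2))
    with torch.no_grad():
        W1 -= eta * g1
        w2 -= eta * g2
        # Primal projection onto a Frobenius ball of radius R around the
        # initialization (the "projected" step, common in NTK-style analyses).
        delta = W1 - W1_init
        if delta.norm() > R:
            W1.copy_(W1_init + delta * (R / delta.norm()))
        # Dual ascent on lambda, clipped (projected) to [0, B].
        lam = min(max(lam + eta_dual * constraint(f(X)).item(), 0.0), B)

print(f"constraint violation at the end: {constraint(f(X)).item():.4f}")
```

Clipping the dual variable to a bounded interval keeps the Lagrangian updates stable in this sketch; the paper's precise projection set, constraint class, and step-size choices follow its own analysis.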
| Original language | English (US) |
|---|---|
| Pages (from-to) | 5812-5851 |
| Number of pages | 40 |
| Journal | Electronic Journal of Statistics |
| Volume | 16 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2022 |
Keywords
- Neural tangent kernel
- non-convex constrained optimization
- online learning with non-convex losses
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty