Abstract
We consider minimization of stochastic functionals that are compositions of a (potentially) nonsmooth convex function h and a smooth function c and, more generally, of stochastic weakly convex functionals. We develop a family of stochastic methods—including a stochastic prox-linear algorithm and a stochastic (generalized) subgradient procedure—and prove that, under mild technical conditions, each converges to first-order stationary points of the stochastic objective. We provide experiments further investigating our methods on nonsmooth phase retrieval problems; the experiments indicate the practical effectiveness of the procedures.
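To make the two updates concrete, the sketch below instantiates them on a standard nonsmooth phase retrieval objective f(x) = (1/m) Σ_i |⟨a_i, x⟩² − b_i|, which has the composite form h(c_i(x)) with h = |·| convex and c_i smooth, so f is weakly convex. This is a minimal illustration only: the specific objective, the diminishing step-size schedule, and the closed-form prox-linear step used here are assumptions for the example, not a reproduction of the paper's experimental setup.

```python
import numpy as np


def stochastic_subgradient_pr(A, b, x0, steps, alpha0=1.0, seed=0):
    """Stochastic subgradient sketch for f(x) = (1/m) sum_i |<a_i, x>^2 - b_i|."""
    rng = np.random.default_rng(seed)
    m, _ = A.shape
    x = x0.copy()
    for k in range(steps):
        i = rng.integers(m)                        # sample one measurement
        a = A[i]
        inner = a @ x
        residual = inner ** 2 - b[i]               # c_i(x)
        g = np.sign(residual) * 2.0 * inner * a    # a subgradient of |c_i(.)| at x
        alpha = alpha0 / np.sqrt(k + 1)            # assumed diminishing step size
        x = x - alpha * g
    return x


def stochastic_prox_linear_pr(A, b, x0, steps, alpha0=1.0, seed=0):
    """Stochastic prox-linear sketch: at each step minimize
    |c_i(x_k) + <grad c_i(x_k), y - x_k>| + ||y - x_k||^2 / (2 alpha_k),
    which has a closed form because h = |.| and the model is affine in y.
    """
    rng = np.random.default_rng(seed)
    m, _ = A.shape
    x = x0.copy()
    for k in range(steps):
        i = rng.integers(m)
        a = A[i]
        inner = a @ x
        c = inner ** 2 - b[i]                      # c_i(x_k)
        g = 2.0 * inner * a                        # grad c_i(x_k)
        alpha = alpha0 / np.sqrt(k + 1)
        gnorm2 = g @ g
        if gnorm2 > 0:
            # minimizer of |c + g^T d| + ||d||^2 / (2 alpha) over d is d = -s * g
            s = np.clip(c / gnorm2, -alpha, alpha)
            x = x - s * g
    return x
```

Both routines take the measurement matrix A (rows a_i), measurements b, an initial point x0, and an iteration budget; the prox-linear step differs from the subgradient step only in how far it moves along the sampled direction, clipping the step so that the linearized residual is not overshot.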
Original language | English (US) |
---|---|
Pages (from-to) | 3229-3259 |
Number of pages | 31 |
Journal | SIAM Journal on Optimization |
Volume | 28 |
Issue number | 4 |
DOIs | |
State | Published - 2018 |
Funding
The first author’s work was partially supported by NSF award CCF-1553086 and an Alfred P. Sloan fellowship. The work of the second author was supported by an E.K. Potter Stanford Graduate Fellowship.
Keywords
- Composite optimization
- Differential inclusion
- Stochastic optimization
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Applied Mathematics