NESTT: A nonconvex primal-dual splitting method for distributed and stochastic optimization

Davood Hajinezhad, Mingyi Hong, Tuo Zhao, Zhaoran Wang

Research output: Contribution to journal › Conference article

17 Scopus citations

Abstract

We study a stochastic and distributed algorithm for nonconvex problems whose objective consists of a sum of N nonconvex L_i/N-smooth functions, plus a non-smooth regularizer. The proposed NonconvEx primal-dual SpliTTing (NESTT) algorithm splits the problem into N subproblems and uses an augmented-Lagrangian-based primal-dual scheme to solve it in a distributed and stochastic manner. With a special non-uniform sampling, a version of NESTT achieves an ε-stationary solution using O((Σ_{i=1}^N √(L_i/N))² / ε) gradient evaluations, which can be up to O(N) times better than (proximal) gradient descent methods. It also achieves a Q-linear convergence rate for nonconvex ℓ1-penalized quadratic problems with polyhedral constraints. Further, we reveal a fundamental connection between primal-dual based methods and a few primal-only methods such as IAG/SAG/SAGA.
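To make the "special non-uniform sampling" concrete, the sketch below draws component i with probability proportional to √L_i and takes an importance-weighted stochastic proximal-gradient step. This is only an illustrative sketch of the sampling idea from the abstract, not the authors' exact NESTT primal-dual update; the functions grad_i, prox_h, and the step size are hypothetical placeholders.

```python
import numpy as np

def sampling_probs(L):
    """p_i proportional to sqrt(L_i): the non-uniform sampling that yields
    the O((sum_i sqrt(L_i/N))^2 / eps) complexity quoted in the abstract."""
    w = np.sqrt(L)
    return w / w.sum()

def nonuniform_stochastic_step(x, L, grad_i, prox_h, step, rng):
    """One importance-sampled step for f(x) = (1/N) * sum_i g_i(x) + h(x).

    NOTE: a generic stochastic proximal-gradient step, not the exact
    NESTT augmented-Lagrangian primal-dual update.
    """
    p = sampling_probs(L)
    i = rng.choice(len(L), p=p)
    # Reweight by 1/(N * p_i) so the gradient estimator is unbiased:
    # E[grad_i / (N p_i)] = (1/N) * sum_i grad g_i(x).
    g = grad_i(i, x) / (len(L) * p[i])
    return prox_h(x - step * g)

# Toy usage: quadratic components g_i(x) = (L_i / 2) * x^2 and h = 0,
# so prox_h is the identity.
rng = np.random.default_rng(0)
L = np.array([1.0, 4.0, 9.0, 16.0])
grad_i = lambda i, x: L[i] * x
x = 1.0
for _ in range(20):
    x = nonuniform_stochastic_step(x, L, grad_i, lambda v: v, 0.05, rng)
```

Sampling proportionally to √L_i (rather than uniformly or proportionally to L_i) is what trades the max over the L_i for their average in the complexity bound, which is the source of the up-to-O(N) improvement claimed in the abstract.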

Original language: English (US)
Pages (from-to): 3215-3223
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - Jan 1 2016
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: Dec 5 2016 - Dec 10 2016

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing