Nested Distributed Gradient Methods with Stochastic Computation Errors

Charikleia Iakovidou, Ermin Wei

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In this work, we consider the problem of a network of agents collectively minimizing a sum of convex functions. The agents in our setting can only access their local objective functions and exchange information with their immediate neighbors. Motivated by applications where computation is imperfect, including, but not limited to, empirical risk minimization (ERM) and online learning, we assume that only noisy estimates of the local gradients are available. To tackle this problem, we adapt a class of Nested Distributed Gradient methods (NEAR-DGD) to the stochastic gradient setting. These methods have minimal storage requirements, are communication-aware, and perform well in settings where gradient computation is costly while communication is relatively inexpensive. We investigate the convergence properties of our method under standard assumptions on the stochastic gradients, i.e., unbiasedness and bounded variance. Our analysis indicates that our method converges to a neighborhood of the optimal solution at a linear rate for strongly convex local functions and appropriately chosen constant steplengths. We also show that distributed optimization with stochastic gradients achieves a noise reduction effect similar to mini-batching, which scales favorably with network size. Finally, we present numerical results demonstrating the effectiveness of our method.
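For intuition, below is a minimal NumPy sketch of a NEAR-DGD-style iteration under the abstract's stochastic-gradient assumptions: each agent takes a step along an unbiased, bounded-variance estimate of its local gradient (the computation step), then performs several rounds of weighted averaging with its immediate neighbors (the communication step). The scalar quadratic objectives, ring topology, noise level, steplength, and number of communication rounds are illustrative assumptions for this sketch, not the paper's algorithmic constants or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: n agents minimize f(x) = sum_i 0.5 * (x - b_i)^2
# over a shared scalar decision variable; the minimizer is mean(b).
n = 5
b = rng.normal(size=n)

# Doubly stochastic mixing matrix W for a ring graph (Metropolis weights):
# each agent averages only with its two immediate neighbors.
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

alpha = 0.1   # constant steplength (assumed, for illustration)
t_comm = 3    # nested communication rounds per iteration
sigma = 0.1   # gradient-noise standard deviation (bounded variance)

x = np.zeros(n)  # one local iterate per agent
for k in range(300):
    # Computation step: unbiased noisy gradient of each local objective,
    # grad f_i(x_i) = x_i - b_i, plus zero-mean Gaussian noise.
    g = (x - b) + sigma * rng.normal(size=n)
    y = x - alpha * g
    # Communication step: t_comm rounds of averaging with neighbors.
    for _ in range(t_comm):
        y = W @ y
    x = y

print("final local iterates:", np.round(x, 3))
print("optimal solution    :", round(b.mean(), 3))
```

Under a constant steplength the iterates contract linearly toward a neighborhood of the optimum whose radius depends on the gradient noise, and the implicit averaging across the n agents provides the mini-batching-like variance reduction mentioned in the abstract.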

Original language: English (US)
Title of host publication: 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 339-346
Number of pages: 8
ISBN (Electronic): 9781728131511
State: Published - Sep 2019
Event: 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019 - Monticello, United States
Duration: Sep 24, 2019 - Sep 27, 2019

Publication series

Name: 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019

Conference

Conference: 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019
Country/Territory: United States
City: Monticello
Period: 9/24/19 - 9/27/19

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Hardware and Architecture
  • Safety, Risk, Reliability and Quality
  • Control and Optimization
