Nested distributed gradient methods with adaptive quantized communication

Albert S. Berahas, Charikleia Iakovidou, Ermin Wei

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we consider minimizing a sum of local convex objective functions in a distributed setting, where communication can be costly. We propose and analyze a class of nested distributed gradient methods with adaptive quantized communication (NEAR-DGD+Q). We show the effect of performing multiple quantized communication steps on the rate of convergence and on the size of the neighborhood of convergence, and prove R-linear convergence to the exact solution with an increasing number of consensus steps and adaptive quantization. We test the performance of the method, as well as some practical variants, on quadratic functions, and show the effects of multiple quantized communication steps in terms of iterations/gradient evaluations, communication, and overall cost.
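To make the gradient-step / nested-consensus pattern described above concrete, here is a minimal Python sketch of a distributed gradient iteration with quantized consensus on simple local quadratics. It assumes a ring communication graph, a doubly stochastic mixing matrix, and a uniform quantizer whose resolution shrinks with the iteration count; these choices, and the schedule for the number of consensus steps, are illustrative assumptions and not the paper's exact NEAR-DGD+Q update.

```python
import numpy as np

# Illustrative sketch (assumptions): n agents minimize the sum of local
# quadratics f_i(x) = 0.5 * (x - b_i)^2, whose global optimum is mean(b).
n = 8                                   # number of agents
b = np.linspace(-1.0, 1.0, n)           # local data held by each agent
alpha = 0.1                             # gradient step size

# Doubly stochastic mixing matrix for a ring graph (simple averaging weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def quantize(v, delta):
    """Uniform quantizer with resolution delta (hypothetical choice)."""
    return delta * np.round(v / delta)

x = np.zeros(n)                         # local iterates, one scalar per agent
for k in range(1, 101):
    # Local gradient step: grad f_i(x_i) = x_i - b_i.
    y = x - alpha * (x - b)

    # Nested quantized communication: t(k) consensus rounds exchanging
    # quantized values; the resolution delta(k) shrinks with k, so the
    # quantization error vanishes (the "adaptive" part).
    t_k = 1 + k // 25                   # increasing number of consensus steps
    delta_k = 0.5 ** k                  # increasingly fine quantizer
    for _ in range(t_k):
        y = W @ quantize(y, delta_k)
    x = y

print("final local iterates:", x)
print("true optimum        :", b.mean())
```

With a fixed step size, a single coarse consensus round keeps the iterates in a neighborhood of the optimum; increasing the number of consensus steps and refining the quantizer, as sketched above, is what lets the iterates approach the exact solution.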

Original language: English (US)
Journal: Unknown Journal
State: Published - Mar 18 2019

Keywords

  • Communication
  • Distributed Optimization
  • Network Optimization
  • Optimization Algorithms
  • Quantization

ASJC Scopus subject areas

  • General

