Nested Distributed Gradient Methods with Adaptive Quantized Communication

Albert S. Berahas, Charikleia Iakovidou, Ermin Wei

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations


In this paper, we consider minimizing a sum of local convex objective functions in a distributed setting, where communication can be costly. We propose and analyze a class of nested distributed gradient methods with adaptive quantized communication (NEAR-DGD+Q). We show the effect of performing multiple quantized communication steps on the rate of convergence and on the size of the neighborhood of convergence, and prove R-linear convergence to the exact solution with an increasing number of consensus steps and adaptive quantization. We test the performance of the method, as well as some practical variants, on quadratic functions, and show the effects of multiple quantized communication steps in terms of iterations/gradient evaluations, communication, and cost.
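To make the idea concrete, the following is a minimal sketch of a NEAR-DGD-style iteration on local quadratics, alternating several quantized consensus (communication) steps with a local gradient step. The mixing matrix, step size, uniform quantizer, and the schedules for the quantization level and the number of consensus rounds are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

# Hypothetical sketch: n nodes minimize sum_i f_i(x), with local
# quadratics f_i(x) = 0.5 * a_i * x^2 - b_i * x (illustrative choice).
rng = np.random.default_rng(0)
n = 4
a = rng.uniform(1.0, 2.0, n)      # local curvatures
b = rng.uniform(-1.0, 1.0, n)     # local linear terms
x_star = b.sum() / a.sum()        # minimizer of the global objective

# Doubly stochastic mixing matrix for a 4-node ring (assumed topology).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

def quantize(v, delta):
    """Uniform deterministic quantization with step size delta."""
    return delta * np.round(v / delta)

x = rng.standard_normal(n)        # local iterates, one per node
alpha = 0.1                       # step size (illustrative)
for k in range(200):
    delta = 0.5 ** k              # adaptive quantization: level shrinks over time
    y = x.copy()
    for _ in range(k + 1):        # increasing number of quantized consensus steps
        y = W @ quantize(y, delta)
    x = y - alpha * (a * y - b)   # local gradient step at the consensus point

# With shrinking delta and growing consensus rounds, the node average
# approaches the exact minimizer x_star.
print(abs(x.mean() - x_star))
```

With a fixed quantization level and a fixed number of consensus rounds, this type of iteration only reaches a neighborhood of the solution; the shrinking `delta` and growing round count in the sketch mimic how adaptivity recovers exact convergence.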

Original language: English (US)
Title of host publication: 2019 IEEE 58th Conference on Decision and Control, CDC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 7
ISBN (Electronic): 9781728113982
State: Published - Dec 2019
Event: 58th IEEE Conference on Decision and Control, CDC 2019 - Nice, France
Duration: Dec 11, 2019 - Dec 13, 2019

Publication series

Name: Proceedings of the IEEE Conference on Decision and Control
ISSN (Print): 0743-1546
ISSN (Electronic): 2576-2370


Conference: 58th IEEE Conference on Decision and Control, CDC 2019


Keywords

  • Communication
  • Distributed Optimization
  • Network Optimization
  • Optimization Algorithms
  • Quantization

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modeling and Simulation
  • Control and Optimization


