FlexPD: A Flexible Framework of First-Order Primal-Dual Algorithms for Distributed Optimization

Fatemeh Mansoori, Ermin Wei

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we study the problem of minimizing a sum of convex objective functions that are locally available to agents in a network. Distributed optimization algorithms enable the agents to solve the problem cooperatively through local computations and communication with neighbors. Lagrangian-based distributed optimization algorithms have received significant attention in recent years due to their exact convergence property. However, many of these algorithms converge slowly or are computationally expensive to execute. In this paper, we propose a flexible framework of first-order primal-dual algorithms (FlexPD), which allows for an arbitrary number of primal steps per iteration. This framework includes three algorithms, FlexPD-F, FlexPD-G, and FlexPD-C, that can be customized for various applications with different computation and communication requirements. For strongly convex objective functions with Lipschitz continuous gradients, by carefully controlling the stepsize choices, we establish linear convergence of our proposed framework to the optimal solution. Simulation results confirm the superior performance of our framework compared to existing methods.
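To make the abstract's central idea concrete, the following is a minimal sketch of a first-order primal-dual consensus method that takes a configurable number of primal (gradient) steps per dual update, in the spirit of the multi-primal-step framework described above. It is an illustrative toy, not the authors' exact FlexPD-F/G/C updates: the network (a 4-agent ring), the local quadratic objectives, the augmented-Lagrangian form, and the stepsizes are all assumptions chosen so the example is self-contained.

```python
import numpy as np

# Toy network: 4 agents on a ring; agent i holds f_i(x) = 0.5 * (x - a_i)^2.
# The minimizer of sum_i f_i subject to consensus is the mean of a (here 4.0).
a = np.array([1.0, 3.0, 5.0, 7.0])
n = a.size

# Graph Laplacian of the 4-cycle; consensus corresponds to L @ x = 0.
L = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])

def grad_f(x):
    # Stacked gradients of the local quadratics (each agent uses only its own).
    return x - a

def primal_dual(num_primal_steps, alpha=0.1, beta=1.0, iters=500):
    """Gradient primal-dual on the augmented Lagrangian
    sum_i f_i(x_i) + lam' L x + (beta/2) x' L x,
    with `num_primal_steps` primal descent steps per dual ascent step."""
    x = np.zeros(n)    # local primal variables, one per agent
    lam = np.zeros(n)  # dual variables for the consensus constraint
    for _ in range(iters):
        for _ in range(num_primal_steps):
            # Primal descent: each agent combines its local gradient with
            # Laplacian terms computed from neighbors' values only.
            x = x - alpha * (grad_f(x) + L @ lam + beta * (L @ x))
        # Single dual ascent step on the consensus violation.
        lam = lam + alpha * (L @ x)
    return x

x_one = primal_dual(num_primal_steps=1)    # one primal step per iteration
x_multi = primal_dual(num_primal_steps=3)  # extra primal steps per iteration
```

Because every term involving other agents enters through the Laplacian, each update uses only neighbor-to-neighbor communication; raising `num_primal_steps` trades extra local computation and communication per iteration for fewer dual iterations, which is the flexibility the framework is built around.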

Original language: English (US)
Journal: IEEE Transactions on Signal Processing
DOIs
State: Accepted/In press - 2021

Keywords

  • Convergence
  • Convex functions
  • Linear programming
  • Minimization
  • Optimization
  • Signal processing algorithms
  • Symmetric matrices

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

