TY - GEN
T1 - GRAND: A Gradient-Related Ascent and Descent Algorithmic Framework for Minimax Problems
T2 - 58th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2022
AU - Niu, Xiaochun
AU - Wei, Ermin
N1 - Funding Information:
This work was supported in part by the National Science Foundation (NSF) under Grants ECCS-2030251 and CMMI-2024774.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - In this work, we study minimax optimization problems, which model many distributed and centralized optimization problems. Existing works mainly focus on the design and analysis of specific methods, such as gradient-type methods, including the gradient descent ascent (GDA) method and its variants, the extra-gradient (EG) and optimistic gradient descent ascent (OGDA) methods, as well as Newton-type methods. We propose GRAND, a gradient-related ascent and descent algorithmic framework for finding global minimax points. It allows updates within acute angles to the partial gradient directions. GRAND covers and motivates gradient-type, Newton-type, and other general descent ascent methods as special cases. It also enables flexible method designs for distributed consensus optimization problems that utilize heterogeneous agents. To the best of our knowledge, GRAND is the first generalized framework for minimax problems with convergence guarantees.
AB - In this work, we study minimax optimization problems, which model many distributed and centralized optimization problems. Existing works mainly focus on the design and analysis of specific methods, such as gradient-type methods, including the gradient descent ascent (GDA) method and its variants, the extra-gradient (EG) and optimistic gradient descent ascent (OGDA) methods, as well as Newton-type methods. We propose GRAND, a gradient-related ascent and descent algorithmic framework for finding global minimax points. It allows updates within acute angles to the partial gradient directions. GRAND covers and motivates gradient-type, Newton-type, and other general descent ascent methods as special cases. It also enables flexible method designs for distributed consensus optimization problems that utilize heterogeneous agents. To the best of our knowledge, GRAND is the first generalized framework for minimax problems with convergence guarantees.
UR - http://www.scopus.com/inward/record.url?scp=85142615333&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85142615333&partnerID=8YFLogxK
U2 - 10.1109/Allerton49937.2022.9929389
DO - 10.1109/Allerton49937.2022.9929389
M3 - Conference contribution
AN - SCOPUS:85142615333
T3 - 2022 58th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2022
BT - 2022 58th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 27 September 2022 through 30 September 2022
ER -