Distributed Gradient Tracking for Unbalanced Optimization With Different Constraint Sets

Bibliographic Details
Published in: IEEE Transactions on Automatic Control, Vol. 68, No. 6, pp. 3633-3640
Main Authors: Cheng, Songsong; Liang, Shu; Fan, Yuan; Hong, Yiguang
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2023
ISSN: 0018-9286, 1558-2523
DOI: 10.1109/TAC.2022.3192316

More Information
Summary: Gradient tracking methods have become popular for distributed optimization in recent years, partly because they achieve linear convergence with only a constant step-size for strongly convex optimization. In this article, we construct a counterexample in constrained optimization to show that the direct extension of gradient tracking via projections cannot guarantee correctness. We then propose projected gradient tracking algorithms with diminishing step-sizes, rather than a constant one, for distributed strongly convex optimization with different constraint sets and unbalanced graphs. Our basic algorithm achieves an $O(\ln T/T)$ convergence rate. Moreover, we design an epoch iteration scheme that improves the convergence rate to $O(1/T)$.
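
To make the idea concrete, below is a minimal sketch (not the authors' exact algorithm) of a projected gradient tracking iteration with a diminishing step-size. It assumes, for simplicity, a doubly stochastic mixing matrix W, a common box constraint for all agents, and illustrative quadratic local objectives; the paper's method additionally handles unbalanced directed graphs and distinct per-agent constraint sets.

```python
# Illustrative sketch of projected gradient tracking with diminishing step-sizes.
# Assumptions (not from the paper): doubly stochastic W, identical box constraints,
# quadratic local objectives f_i(x) = 0.5 * ||x - c_i||^2.
import numpy as np

n, d, T = 4, 2, 2000                     # agents, dimension, iterations
rng = np.random.default_rng(0)
c = rng.normal(size=(n, d))              # local targets defining f_i
lo, hi = -0.5, 0.5                       # box constraint set X_i = [lo, hi]^d

W = np.array([[0.5,  0.5,  0.0,  0.0],   # doubly stochastic mixing matrix
              [0.5,  0.25, 0.25, 0.0],
              [0.0,  0.25, 0.5,  0.25],
              [0.0,  0.0,  0.25, 0.75]])

def grad(i, x):                          # gradient of f_i at x
    return x - c[i]

def proj(i, x):                          # projection onto X_i (here the same box for all i)
    return np.clip(x, lo, hi)

x = np.zeros((n, d))                                  # agents' estimates
y = np.array([grad(i, x[i]) for i in range(n)])       # gradient trackers, y_i^0 = grad f_i(x_i^0)

for k in range(T):
    alpha = 1.0 / (k + 1)                             # diminishing step-size
    # projected consensus + tracked-gradient descent step
    x_new = np.array([proj(i, W[i] @ x - alpha * y[i]) for i in range(n)])
    # tracker update: mix, then add the change in the local gradient
    y = np.array([W[i] @ y + grad(i, x_new[i]) - grad(i, x[i]) for i in range(n)])
    x = x_new

print("consensus estimate:", x.mean(axis=0))
print("reference solution:", np.clip(c.mean(axis=0), lo, hi))  # minimizer of this separable quadratic over the box
```

With the diminishing step-size alpha_k = 1/(k+1), the agents' estimates approach the constrained minimizer, in line with the O(ln T/T) rate stated in the summary; replacing alpha_k with a constant step is exactly the projected extension that the article's counterexample shows can fail.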