Accelerated Gradient Methods via Inertial Systems with Hessian-driven Damping


Bibliographic Details
Main Authors: Wang, Zepeng; Peypouquet, Juan
Format: Journal Article
Language: English
Published: 24.02.2025
DOI: 10.48550/arxiv.2502.16953


More Information
Summary: We analyze the convergence rates of a family of inertial algorithms obtained by discretizing an inertial system with Hessian-driven damping. For smooth strongly convex functions, we recover a convergence rate with up to a factor-of-2 speedup over Nesterov's scheme. As a byproduct of our analysis, we also derive linear convergence rates for convex functions satisfying a quadratic growth condition or the Polyak-Łojasiewicz inequality. A distinctive feature of our results is that the dependence of the convergence rate on the parameters of the inertial system/algorithm is made explicit, which may help clarify the acceleration mechanism underlying an inertial algorithm.
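The summary refers to inertial algorithms obtained by discretizing an inertial system with Hessian-driven damping. As a minimal sketch (not the paper's exact scheme), one common discrete realization replaces the Hessian term with a difference of consecutive gradients; the parameter values `alpha`, `beta`, and `s` below are hypothetical toy choices for illustration only.

```python
import numpy as np

def inertial_hessian_damped(grad, x0, s=0.1, alpha=0.7, beta=0.05, iters=200):
    """Illustrative inertial gradient method with a Hessian-driven
    damping correction, in the discrete form
        x_{k+1} = x_k + alpha*(x_k - x_{k-1}) - s*grad(x_k)
                      - beta*(grad(x_k) - grad(x_{k-1})),
    where the gradient-difference term stands in for s*beta*Hessian*velocity.
    All parameter choices here are toy values, not the paper's tuning."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    for _ in range(iters):
        g = grad(x)
        x_next = x + alpha * (x - x_prev) - s * g - beta * (g - g_prev)
        x_prev, x, g_prev = x, x_next, g
    return x

# Toy strongly convex quadratic f(x) = 0.5 * x^T A x, minimized at the origin.
A = np.diag([1.0, 10.0])
sol = inertial_hessian_damped(lambda x: A @ x, x0=[5.0, -3.0])
```

On this ill-conditioned quadratic the iterates contract toward the minimizer at a linear rate; how that rate depends on `alpha`, `beta`, and `s` is the kind of explicit dependence the paper analyzes.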