Accelerated Gradient Methods via Inertial Systems with Hessian-driven Damping
We analyze the convergence rate of a family of inertial algorithms that can be obtained by discretization of an inertial system with Hessian-driven damping. We recover a convergence rate with up to a factor-of-2 speedup over Nesterov's scheme for smooth strongly convex functions. As a byproduct of our analysis, we also derive linear convergence rates for convex functions satisfying a quadratic growth condition or the Polyak-Łojasiewicz inequality. A significant feature of our results is that the dependence of the convergence rate on the parameters of the inertial system/algorithm is revealed explicitly. This may help one better understand the acceleration mechanism underlying an inertial algorithm.
| Format | Journal Article |
|---|---|
| Language | English |
| Published | 24.02.2025 |
| DOI | 10.48550/arxiv.2502.16953 |
| Summary: | We analyze the convergence rate of a family of inertial algorithms that can be obtained by discretization of an inertial system with Hessian-driven damping. We recover a convergence rate with up to a factor-of-2 speedup over Nesterov's scheme for smooth strongly convex functions. As a byproduct of our analysis, we also derive linear convergence rates for convex functions satisfying a quadratic growth condition or the Polyak-Łojasiewicz inequality. A significant feature of our results is that the dependence of the convergence rate on the parameters of the inertial system/algorithm is revealed explicitly. This may help one better understand the acceleration mechanism underlying an inertial algorithm. |
|---|---|
| DOI: | 10.48550/arxiv.2502.16953 |
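
The abstract concerns inertial (momentum) algorithms that accelerate gradient descent on smooth strongly convex functions. As background only, not the paper's specific scheme, here is a minimal sketch of classical Nesterov acceleration for an L-smooth, mu-strongly convex function, using the standard constant momentum coefficient (sqrt(kappa) - 1)/(sqrt(kappa) + 1) with kappa = L/mu; the function and variable names are illustrative assumptions.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, L, mu, iters=100):
    """Nesterov's accelerated gradient method for an L-smooth,
    mu-strongly convex objective, given its gradient oracle `grad`."""
    kappa = L / mu                                    # condition number
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient
    x = np.asarray(x0, dtype=float)
    y = x.copy()                                      # extrapolated point
    for _ in range(iters):
        x_next = y - grad(y) / L                      # gradient step at y
        y = x_next + beta * (x_next - x)              # inertial extrapolation
        x = x_next
    return x

# Illustrative use: quadratic f(x) = 0.5 x^T A x with eigenvalues in [mu, L],
# so grad f(x) = A x and the minimizer is the origin.
A = np.diag([1.0, 10.0])
x_star = nesterov_strongly_convex(lambda v: A @ v,
                                  np.array([5.0, 5.0]),
                                  L=10.0, mu=1.0, iters=200)
```

For this conditioning (kappa = 10) the iterates contract geometrically, so after 200 iterations `x_star` is numerically at the minimizer; the Hessian-driven damping studied in the paper adds a gradient-difference correction term on top of such inertial schemes.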