Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization

Bibliographic Details
Published in: Mathematical Programming, Vol. 190, no. 1-2, pp. 285-329
Main Author: László, Szilárd Csaba
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.11.2021
Springer Nature B.V.
ISSN: 0025-5610
eISSN: 1436-4646
DOI: 10.1007/s10107-020-01534-w

More Information
Summary: We investigate an inertial algorithm of gradient type in connection with the minimization of a non-convex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We prove abstract convergence results which, when applied to our numerical scheme, allow us to show that the generated sequences converge to a critical point of the objective function, provided a regularization of the objective function satisfies the Kurdyka–Łojasiewicz property. Further, we obtain convergence rates for the generated sequences and for the objective function values, formulated in terms of the Łojasiewicz exponent of a regularization of the objective function. Finally, numerical experiments are presented comparing our scheme with well-known algorithms from the literature.
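The kind of inertial (Nesterov-style) gradient scheme the summary refers to can be sketched as follows. This is an illustrative implementation only, not the paper's exact algorithm: the step size, the extrapolation coefficient beta_n = n/(n+3), and the test objective f(x) = x^4/4 - x^2/2 are all assumptions chosen for demonstration.

```python
import numpy as np

def inertial_gradient(grad, x0, step=0.1, n_iter=200):
    """Illustrative inertial gradient scheme in the spirit of Nesterov:

        y_n     = x_n + beta_n * (x_n - x_{n-1})
        x_{n+1} = y_n - step * grad(y_n)

    with beta_n = n / (n + 3), a common Nesterov-type choice
    (the paper's exact parameters may differ).
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for n in range(1, n_iter + 1):
        beta = n / (n + 3.0)                 # extrapolation coefficient
        y = x + beta * (x - x_prev)          # inertial extrapolation step
        x_prev, x = x, y - step * grad(y)    # gradient step at y
    return x

# Smooth non-convex objective f(x) = x^4/4 - x^2/2,
# with critical points at x = -1, 0, 1.
grad = lambda x: x**3 - x
x_star = inertial_gradient(grad, np.array([2.0]))
# starting from x0 = 2, the iterates approach the critical point x = 1
```

Note that for a non-convex objective the scheme is only guaranteed to reach a critical point, which is exactly the setting where the Kurdyka–Łojasiewicz property mentioned in the summary is used to establish convergence of the whole sequence.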