A proximal-gradient inertial algorithm with Tikhonov regularization: strong convergence to the minimal norm solution

Bibliographic Details
Published in: Optimization methods & software, Vol. 40, No. 4, pp. 947-976
Main Author: László, Szilárd Csaba
Format: Journal Article
Language: English
Published: Taylor & Francis, 04.07.2025
ISSN: 1055-6788, 1029-4937
DOI: 10.1080/10556788.2025.2517172

Summary: We investigate the strong convergence properties of a proximal-gradient inertial algorithm with two Tikhonov regularization terms, in connection with the minimization problem of the sum of a convex lower semi-continuous function f and a smooth convex function g. For an appropriate setting of the parameters, we establish strong convergence of the generated sequence $(x_k)_{k\ge 0}$ to the minimum norm minimizer of the objective function f + g. Further, we obtain fast convergence to zero of the objective function values along the generated sequence, as well as of the discrete velocity and the subgradient of the objective function. We also show that, for another setting of the parameters, the optimal rate of order $\mathcal{O}(k^{-2})$ for the potential energy $(f+g)(x_k)-\min(f+g)$ can be obtained.
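
To give a concrete picture of the kind of scheme studied here, the following is a minimal Python sketch of a generic inertial (Nesterov-type) proximal-gradient iteration in which a vanishing Tikhonov term eps_k is added to the smooth part, applied to f(x) = lam*||x||_1 and g(x) = (1/2)||Ax - b||^2. The paper's algorithm involves two Tikhonov regularization terms and specific parameter rules that the abstract does not spell out, so the update rule, the inertial coefficient alpha_k and the schedule for eps_k below are illustrative assumptions, not the author's exact method.

# Illustrative sketch only: generic inertial proximal-gradient step with a
# vanishing Tikhonov term eps_k. The paper's scheme (two Tikhonov terms,
# specific parameter choices) is not reproduced here.
import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_prox_grad_tikhonov(A, b, lam, n_iter=500):
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad g
    step = 1.0 / (L + 1.0)               # conservative step covering the eps_k term
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for k in range(1, n_iter + 1):
        alpha_k = (k - 1) / (k + 2)      # Nesterov-type inertial coefficient (assumed)
        eps_k = 1.0 / (k + 1) ** 2       # vanishing Tikhonov parameter (assumed schedule)
        y = x + alpha_k * (x - x_prev)   # extrapolated (inertial) point
        grad = A.T @ (A @ y - b) + eps_k * y   # gradient of g plus Tikhonov term
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)
    return x

# usage on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
x_star = inertial_prox_grad_tikhonov(A, b, lam=0.1)

In results of this type, the vanishing Tikhonov term is what drives the iterates toward the minimum norm minimizer; with eps_k = 0 the sketch reduces to a standard FISTA-style iteration.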