A recent proximal gradient algorithm for convex minimization problem using double inertial extrapolations
| Published in | AIMS Mathematics Vol. 9; no. 7; pp. 18841 - 18859 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | AIMS Press, 01.01.2024 |
| ISSN | 2473-6988 |
| DOI | 10.3934/math.2024917 |
| Summary: | In this study, we propose a new class of forward-backward (FB) algorithms designed to solve convex minimization problems. Our method incorporates a linesearch technique, which removes the need to explicitly impose a Lipschitz assumption on the gradient. Additionally, we apply double inertial extrapolations to enhance the algorithm's convergence rate. We establish a weak convergence theorem under mild conditions. Furthermore, we perform numerical tests and apply the algorithm to image restoration and data classification as practical applications. The experimental results show the superior performance and effectiveness of our approach, surpassing some existing methods in the literature. |
|---|---|
| ISSN: | 2473-6988 |
| DOI: | 10.3934/math.2024917 |
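The summary describes a forward-backward (proximal gradient) scheme combined with a double inertial extrapolation and a linesearch on the step size. The sketch below illustrates what such a scheme typically looks like on an l1-regularized least-squares problem; the inertial weights `theta` and `delta`, the sufficient-decrease linesearch condition, and all parameter values are generic assumptions for illustration, not the paper's exact algorithm or analysis.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fb_double_inertia(A, b, lam, theta=0.3, delta=0.1,
                      sigma0=1.0, rho=0.5, max_iter=500, tol=1e-8):
    """
    Generic forward-backward sketch for
        min_x  0.5 * ||A x - b||^2 + lam * ||x||_1
    using a double inertial extrapolation and a backtracking
    linesearch, so no Lipschitz constant is needed a priori.
    Illustrative stand-in only, not the paper's exact scheme.
    """
    f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
    grad_f = lambda x: A.T @ (A @ x - b)

    n = A.shape[1]
    x_prev2 = x_prev = x = np.zeros(n)

    for _ in range(max_iter):
        # Double inertial extrapolation: combine the last three iterates.
        y = x + theta * (x - x_prev) + delta * (x_prev - x_prev2)

        # Backtracking linesearch: shrink the step until a
        # sufficient-decrease condition holds at the FB point.
        sigma = sigma0
        g_y = grad_f(y)
        while True:
            z = soft_threshold(y - sigma * g_y, sigma * lam)
            d = z - y
            if f(z) <= f(y) + g_y @ d + (d @ d) / (2.0 * sigma):
                break
            sigma *= rho

        x_prev2, x_prev, x = x_prev, x, z
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x
```

In practice, the choice of the inertial sequences and the linesearch parameters is what the paper's weak convergence theorem constrains; the fixed values used above are placeholders chosen only to keep the sketch self-contained.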