A proximal splitting algorithm for generalized DC programming with applications in signal recovery
| Published in | European Journal of Operational Research, Vol. 326, No. 1, pp. 42–53 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.10.2025 |
| ISSN | 0377-2217; 1872-6860 |
| DOI | 10.1016/j.ejor.2025.04.034 |
Summary: The difference-of-convex (DC) program is an important model in nonconvex optimization, owing to a structure that captures a wide range of practical applications. In this paper, we tackle a generalized class of DC programs, where the objective function is formed by summing a possibly nonsmooth nonconvex function and a differentiable nonconvex function with Lipschitz continuous gradient, and then subtracting a nonsmooth continuous convex function. We develop a proximal splitting algorithm that uses a proximal evaluation for the concave part and Douglas–Rachford splitting for the remaining components. The algorithm guarantees subsequential convergence to a critical point of the problem model. Under the widely used Kurdyka–Łojasiewicz property, we establish global convergence of the full sequence of iterates and derive convergence rates for both the iterates and the objective function values, without assuming that the concave part is differentiable. The performance of the proposed algorithm is tested on signal recovery problems with a nonconvex regularization term, and it exhibits competitive results compared to notable algorithms in the literature on both synthetic and real-world data.
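Written out (a reconstruction from the abstract; the symbols f, h, g are labels chosen here, not necessarily the paper's notation), the problem class is

```latex
\min_{x \in \mathbb{R}^n} \; f(x) + h(x) - g(x),
\qquad \text{where}\;
\begin{cases}
f & \text{is proper, possibly nonsmooth and nonconvex},\\
h & \text{is differentiable with Lipschitz continuous gradient, possibly nonconvex},\\
g & \text{is continuous and convex, possibly nonsmooth.}
\end{cases}
```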
Highlights:
- Novel proximal splitting algorithm for generalized difference-of-convex programming.
- Subsequential and full sequential convergence results under mild assumptions.
- Competitive performance on both synthetic and real-world datasets.
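The abstract describes the method only at a high level, so the following is a minimal sketch of one plausible instantiation, not the paper's exact update rule: the concave part g is handled through a subgradient (for prox-friendly g this can be obtained via its proximal mapping), and the remaining sum f + h is handled with one Douglas–Rachford step per outer iteration. The ℓ1 − ℓ2 regularized least-squares instance, the function names, and all step sizes below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_data_fit(v, gamma, AtA, Atb, n):
    # Proximal map of (gamma/2) * ||Ax - b||^2:
    # solves (I + gamma * A^T A) x = v + gamma * A^T b.
    return np.linalg.solve(np.eye(n) + gamma * AtA, v + gamma * Atb)

def dc_dr_sketch(A, b, lam=0.1, gamma=1.0, iters=300):
    """Hypothetical sketch (not the paper's exact scheme):
    DC linearization of the concave part plus one Douglas-Rachford
    step per outer iteration, applied to the illustrative model
        min_x  lam*||x||_1 + 0.5*||Ax - b||^2 - lam*||x||_2,
    i.e. f = lam*||.||_1, h = 0.5*||A. - b||^2, g = lam*||.||_2."""
    m, n = A.shape
    AtA, Atb = A.T @ A, A.T @ b
    z = np.zeros(n)  # Douglas-Rachford governing sequence
    x = np.zeros(n)
    for _ in range(iters):
        # Subgradient of the concave part g(x) = lam * ||x||_2.
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros(n)
        # Douglas-Rachford step on f(x) + [h(x) - <xi, x>];
        # the linear term shifts the prox argument by gamma * xi.
        y = soft_threshold(z, gamma * lam)                      # prox of gamma*f
        w = prox_data_fit(2 * y - z + gamma * xi, gamma, AtA, Atb, n)
        z = z + w - y
        x = y
    return x

if __name__ == "__main__":
    # Toy sparse recovery instance with synthetic data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[[3, 17, 42]] = [1.0, -2.0, 1.5]
    b = A @ x_true
    x_hat = dc_dr_sketch(A, b, lam=0.1, gamma=1.0, iters=300)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The ℓ1 − ℓ2 penalty is a common nonconvex regularizer in the sparse signal recovery literature and fits the abstract's problem class, but the paper's actual experiments and update rule may differ.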