Stochastic incremental mirror descent algorithms with Nesterov smoothing


Bibliographic Details
Published in: Numerical Algorithms, Vol. 95, No. 1, pp. 351–382
Main Authors: Bitterlich, Sandy; Grad, Sorin-Mihai
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Verlag), 01.01.2024
ISSN: 1017-1398, 1572-9265
DOI: 10.1007/s11075-023-01574-1


More Information
Summary: For minimizing a sum of finitely many proper, convex and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space, we propose a stochastic incremental mirror descent algorithm constructed by means of Nesterov smoothing. Further, we modify the algorithm in order to minimize, over such a set, a sum of finitely many proper, convex and lower semicontinuous functions composed with linear operators. Next, a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing is proposed in order to minimize, over such a set, the sum of finitely many proper, convex and lower semicontinuous functions and a prox-friendly proper, convex and lower semicontinuous function. Unlike previous contributions from the literature on mirror descent methods for minimizing sums of functions, we do not require these functions to be (Lipschitz) continuous or differentiable. Applications in logistics, tomography and machine learning, modelled as optimization problems, illustrate the theoretical achievements.
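To illustrate the kind of scheme the abstract describes, here is a minimal, hypothetical sketch (not the authors' exact algorithm): a stochastic incremental mirror descent over the probability simplex, where each nonsmooth summand |⟨a_i, x⟩ − b_i| is replaced by its Nesterov (Huber-type) smoothing and the mirror map is the negative entropy. All names, step sizes and the smoothing parameter `mu` are illustrative assumptions.

```python
import numpy as np

def huber_grad(r, mu):
    # Gradient of the Nesterov smoothing of |r| with parameter mu:
    # the smoothed function is r^2/(2*mu) for |r| <= mu, |r| - mu/2 otherwise.
    return np.clip(r / mu, -1.0, 1.0)

def stochastic_incremental_mirror_descent(A, b, steps=20000, mu=0.05, seed=0):
    """Minimize (approximately) sum_i |<a_i, x> - b_i| over the simplex."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)              # start at the simplex barycenter
    for t in range(1, steps + 1):
        i = rng.integers(m)              # incremental step: sample one summand
        g = huber_grad(A[i] @ x - b[i], mu) * A[i]   # gradient of smoothed term
        eta = 1.0 / np.sqrt(t)           # diminishing step size
        x = x * np.exp(-eta * g)         # entropic mirror step
        x /= x.sum()                     # Bregman projection back onto the simplex
    return x

# Usage on synthetic consistent data:
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))
x_star = np.array([0.2, 0.3, 0.5])
b = A @ x_star
x = stochastic_incremental_mirror_descent(A, b)
```

The entropic mirror map makes the Bregman projection onto the simplex an exact, cheap renormalization, which is the usual motivation for mirror descent over this feasible set; the smoothing parameter `mu` trades approximation bias against gradient boundedness, in the spirit of Nesterov smoothing.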