Inexact SARAH algorithm for stochastic optimization

Bibliographic Details
Published in: Optimization Methods & Software, Vol. 36, no. 1, pp. 237-258
Main Authors: Nguyen, Lam M.; Scheinberg, Katya; Takáč, Martin
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 02.01.2021
ISSN: 1055-6788, 1029-4937
DOI: 10.1080/10556788.2020.1818081

Summary: We develop and analyse a variant of the SARAH algorithm which does not require computation of the exact gradient, so the new method can be applied to general expectation minimization problems rather than only finite-sum problems. While the original SARAH algorithm, like its predecessor SVRG, requires an exact gradient computation at each outer iteration, the inexact variant developed here (iSARAH) requires only a stochastic gradient computed on a mini-batch of sufficient size. The proposed method combines variance reduction via sample size selection with iterative stochastic gradient updates. We analyse the convergence rate of the algorithm in the strongly convex and non-strongly convex cases, under smoothness assumptions, with an appropriate mini-batch size selected for each case. We show that, under one additional reasonable assumption, iSARAH achieves the best-known complexity among stochastic methods for non-strongly convex stochastic functions.
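The structure the summary describes can be sketched in code: SARAH maintains a recursive gradient estimate v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1} in the inner loop, and the "inexact" change replaces the full gradient at each outer iteration with a mini-batch estimate. The following is a minimal illustrative sketch on a toy finite-sum least-squares problem; the problem data, step size, loop counts, and mini-batch size are made-up assumptions, not the paper's prescribed choices (which depend on the convexity and smoothness assumptions analysed there).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: P(w) = (1/n) * sum_i 0.5 * (a_i @ w - y_i)^2
# (an illustrative stand-in for the general expectation setting)
n, d = 1000, 10
A = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = A @ w_true + 0.01 * rng.normal(size=n)

def grad(w, idx):
    """Average gradient of the components indexed by idx (an array)."""
    Ai, yi = A[idx], y[idx]
    return Ai.T @ (Ai @ w - yi) / len(idx)

def isarah(w0, eta=0.01, outer=30, inner=100, batch=100):
    # Hypothetical hyperparameters chosen for this toy problem only.
    w = w0.copy()
    for _ in range(outer):
        # Inexact outer step: a mini-batch gradient estimate in place of
        # the full gradient that SARAH/SVRG would compute here.
        B = rng.choice(n, size=batch, replace=False)
        v = grad(w, B)
        w_prev, w = w, w - eta * v
        for _ in range(inner):
            i = np.array([rng.integers(n)])
            # SARAH recursion: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = grad(w, i) - grad(w_prev, i) + v
            w_prev, w = w, w - eta * v
    return w

w_hat = isarah(np.zeros(d))
print(np.linalg.norm(w_hat - w_true))
```

The only difference from plain SARAH in this sketch is the `B`-indexed mini-batch estimate at the top of the outer loop; with `batch = n` (and the full index set) it reduces to the exact-gradient variant.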