A new inexact stochastic recursive gradient descent algorithm with Barzilai–Borwein step size in machine learning

Bibliographic Details
Published in: Nonlinear Dynamics, Vol. 111, No. 4, pp. 3575–3586
Main Authors: Yang, Yi-ming; Wang, Fu-sheng; Li, Jin-xiang; Qin, Yuan-yuan
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands (Springer Nature B.V.), 01.02.2023
ISSN: 0924-090X, 1573-269X
DOI: 10.1007/s11071-022-07987-2

Summary: The inexact SARAH (iSARAH) algorithm, a variant of the SARAH variance-reduction algorithm, has recently surged into prominence for solving large-scale optimization problems in machine learning. The performance of iSARAH depends significantly on the choice of step-size sequence. In this paper, we develop a new algorithm called iSARAH-BB, which employs the Barzilai–Borwein (BB) method to compute the step size automatically within the SARAH framework. By introducing this adaptive step size into the design of the new algorithm, iSARAH-BB takes better advantage of both the iSARAH and BB methods. Finally, we analyze the convergence rate and the complexity of the new algorithm under the usual assumptions. Numerical experiments on standard datasets indicate that the proposed iSARAH-BB algorithm is robust to the selection of the initial step size, and that it is effective and more competitive than existing algorithms.
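The abstract's core idea is to let the Barzilai–Borwein rule pick the step size adaptively instead of tuning it by hand. As a rough illustration of how a BB step size is computed per outer epoch, here is a minimal sketch following the common SVRG-BB convention (the function name, the 1/m scaling by the inner-loop length, and the safeguard are assumptions for illustration, not the paper's exact iSARAH-BB rule):

```python
import numpy as np

def bb_step_size(x_curr, x_prev, g_curr, g_prev, m, eps=1e-10):
    """Barzilai-Borwein (BB1) step size from two successive outer
    iterates and their (full or estimated) gradients.

    Illustrative sketch only: eta_k = ||s||^2 / (m * |s . y|), where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}, and m is the number of
    inner-loop iterations per epoch."""
    s = x_curr - x_prev          # iterate difference
    y = g_curr - g_prev          # gradient difference
    denom = abs(float(s @ y))
    if denom < eps:              # safeguard: curvature estimate vanished
        return None              # caller keeps the previous step size
    return float(s @ s) / (m * denom)

# Toy usage: s = (1, 0), y = (2, 0), m = 1 gives eta = 1 / 2 = 0.5.
eta = bb_step_size(np.array([1.0, 0.0]), np.zeros(2),
                   np.array([2.0, 0.0]), np.zeros(2), m=1)
```

The division safeguard matters in practice: near a stationary point the curvature estimate s·y can approach zero, so adaptive-step methods typically fall back to the previous step size or clip eta to a bounded range.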