Variance reduced moving balls approximation method for smooth constrained minimization problems

Bibliographic Details
Published in Optimization letters Vol. 18; no. 5; pp. 1253–1271
Main Authors Yang, Zhichun; Xia, Fu-quan; Tu, Kai
Format Journal Article
Language English
Published Berlin/Heidelberg: Springer Berlin Heidelberg, 01.06.2024
ISSN 1862-4472
EISSN 1862-4480
DOI 10.1007/s11590-023-02049-x


Abstract In this paper, we consider the problem of minimizing the sum of a large number of smooth convex functions subject to a complicated constraint set defined by a smooth convex function. Such a problem has wide applications in many areas, such as machine learning and signal processing. By utilizing variance reduction and moving balls approximation techniques, we propose a new variance reduced moving balls approximation method. Compared with existing convergence rates of moving balls approximation-type methods that require the strong convexity of the objective function, a notable advantage of the proposed method is that the linear and sublinear convergence rates can be guaranteed under the quadratic gradient growth property and convexity condition, respectively. To demonstrate its effectiveness, numerical experiments for solving the smooth regularized logistic regression problem and the Neyman-Pearson classification problem are presented.
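The abstract combines two ingredients: an SVRG-style variance-reduced gradient estimator for the finite-sum objective, and a moving balls approximation step, which replaces the constraint set {x : g(x) ≤ 0} by the ball obtained from the quadratic upper model of g at the current iterate. The following is a minimal illustrative sketch of that combination on a toy constrained least-squares problem; it is not the authors' exact algorithm, and the smoothness constants, toy data, and snapshot schedule are assumptions made for the example.

```python
import numpy as np

# Toy instance:  min (1/n) sum_i f_i(x)  s.t.  g(x) <= 0,
# with f_i(x) = 0.5*(a_i @ x - b_i)^2 and g(x) = ||x||^2 - 1 (unit ball).
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_fi(x, i):            # gradient of one component f_i
    return (A[i] @ x - b[i]) * A[i]

def grad_F(x):                # full gradient of F(x) = (1/n) sum_i f_i(x)
    return A.T @ (A @ x - b) / n

def g(x):                     # constraint function; g(x) <= 0 is feasibility
    return x @ x - 1.0

def grad_g(x):
    return 2.0 * x

L_F = np.linalg.eigvalsh(A.T @ A / n).max()  # smoothness constant of F
L_g = 2.0                                    # smoothness constant of g

def mba_step(x, v, L_F, L_g):
    """One moving-balls step: minimize the quadratic model
    <v, y - x> + (L_F/2)||y - x||^2 over the ball
    g(x) + <grad g(x), y - x> + (L_g/2)||y - x||^2 <= 0,
    which by smoothness of g lies inside {g <= 0}."""
    c = x - grad_g(x) / L_g                      # center of the ball
    r2 = (grad_g(x) @ grad_g(x)) / L_g**2 - 2.0 * g(x) / L_g
    r = np.sqrt(max(r2, 0.0))                    # radius of the ball
    u = x - v / L_F                              # unconstrained model minimizer
    dist = np.linalg.norm(u - c)
    # Projecting u onto the ball minimizes the isotropic quadratic model.
    return c + (u - c) * min(1.0, r / dist) if dist > 0 else c

# SVRG-style outer/inner loop: refresh the full gradient at a snapshot,
# then take stochastic moving-balls steps with the variance-reduced estimate.
x_tilde = np.zeros(d)                            # feasible start: g(0) < 0
for epoch in range(4):
    mu = grad_F(x_tilde)                         # full gradient at snapshot
    x = x_tilde.copy()
    for _ in range(50):
        i = rng.integers(n)
        v = grad_fi(x, i) - grad_fi(x_tilde, i) + mu   # variance-reduced grad
        x = mba_step(x, v, L_F, L_g)
    x_tilde = x.copy()                           # update snapshot

# Every iterate stays feasible by construction of the ball subproblem.
print("feasible:", g(x_tilde) <= 1e-9)
```

With the exact gradient (v = ∇F(x)) this reduces to the deterministic moving balls approximation method, which is guaranteed to decrease F at every step; the variance-reduced estimator keeps the per-iteration cost at a few component gradients instead of a full pass over the data.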
Author Tu, Kai
Yang, Zhichun
Xia, Fu-quan
Author_xml – sequence: 1
  givenname: Zhichun
  surname: Yang
  fullname: Yang, Zhichun
  organization: School of Mathematical Science, Sichuan Normal University
– sequence: 2
  givenname: Fu-quan
  surname: Xia
  fullname: Xia, Fu-quan
  organization: School of Mathematical Science, Sichuan Normal University
– sequence: 3
  givenname: Kai
  orcidid: 0000-0002-0557-6792
  surname: Tu
  fullname: Tu, Kai
  email: kaitu_02@163.com
  organization: College of Mathematics and Statistics, Shenzhen University, School of Mathematical Science, Laurent Mathematics Center, Sichuan Normal University
Cites_doi 10.1002/wics.1376
10.1287/moor.2015.0735
10.1080/10618600.2018.1473777
10.1007/s10107-004-0552-5
10.1007/s12532-021-00214-w
10.1137/1.9781611974997
10.1007/s10107-017-1206-8
10.1137/090763317
10.1137/16M1080173
10.1137/20M1314057
10.1214/aoms/1177729586
10.1007/s11590-014-0795-x
10.1007/s10107-019-01425-9
10.1007/s11590-019-01520-y
10.1137/070704277
10.1007/978-3-319-91578-4
10.1007/978-1-4757-4296-1
10.1109/ICASSP.2017.7952918
10.4208/jcm.1912-m2016-0634
10.1007/s10107-018-1232-1
10.1007/s11590-020-01550-x
ContentType Journal Article
Copyright The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Discipline Engineering
Mathematics
EISSN 1862-4480
EndPage 1271
GrantInformation_xml – fundername: Innovative Research Group Project of the National Natural Science Foundation of China
  grantid: No. 12101436
  funderid: http://dx.doi.org/10.13039/100014718
ISSN 1862-4472
IsPeerReviewed true
IsScholarly true
Issue 5
Keywords Convergence rate
Smooth constrained minimization
Moving balls approximation
Relaxed strong convexity condition
Variance reduction
Language English
ORCID 0000-0002-0557-6792
PageCount 19
PublicationCentury 2000
PublicationDate 2024-06
PublicationDateYYYYMMDD 2024-06-01
PublicationDecade 2020
PublicationPlace Berlin/Heidelberg
PublicationTitle Optimization letters
PublicationTitleAbbrev Optim Lett
PublicationYear 2024
Publisher Springer Berlin Heidelberg
References
Bottou, L., Curtis, F.E., Nocedal, J.: Optimization methods for large-scale machine learning. SIAM Rev. 60(2), 223–311 (2018). https://doi.org/10.1137/16M1080173
Zhang, H., Cheng, L.: Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization. Optim. Lett. 9, 961–979 (2015). https://doi.org/10.1007/s11590-014-0795-x
Wu, S.X., Yue, M.-C., So, A.M.-C., Ma, W.-K.: SDR approximation bounds for the robust multicast beamforming problem with interference temperature constraints. In: ICASSP, pp. 4054–4058. IEEE (2017). https://doi.org/10.1109/ICASSP.2017.7952918
Fang, C., Li, C.J., Lin, Z., Zhang, T.: SPIDER: near-optimal non-convex optimization via stochastic path-integrated differential estimator. Adv. Neural Inf. Process. Syst. 31, 687–697 (2018)
Gaines, B.R., Kim, J., Zhou, H.: Algorithms for fitting the constrained lasso. J. Comput. Graph. Stat. 27(4), 861–871 (2018). https://doi.org/10.1080/10618600.2018.1473777
Adachi, S., Nakatsukasa, Y.: Eigenvalue-based algorithm and analysis for nonconvex QCQP with one constraint. Math. Program. 173, 79–116 (2019). https://doi.org/10.1007/s10107-017-1206-8
Liu, Y., Wang, X., Guo, T.: A linearly convergent stochastic recursive gradient method for convex optimization. Optim. Lett. 14, 2265–2283 (2020). https://doi.org/10.1007/s11590-020-01550-x
Defazio, A., Bach, F., Lacoste-Julien, S.: SAGA: a fast incremental gradient method with support for non-strongly convex composite objectives. Adv. Neural Inf. Process. Syst. 27, 1646–1654 (2014)
Yu, P., Pong, T.K., Lu, Z.: Convergence rate analysis of a sequential convex programming method with line search for a class of constrained difference-of-convex optimization problems. SIAM J. Optim. 31(3), 2024–2054 (2021). https://doi.org/10.1137/20M1314057
Zhang, L.W.: A stochastic moving balls approximation method over a smooth inequality constraint. J. Comput. Math. 38(3), 528–546 (2020). https://doi.org/10.4208/jcm.1912-m2016-0634
Nemirovski, A., Juditsky, A., Lan, G., Shapiro, A.: Robust stochastic approximation approach to stochastic programming. SIAM J. Optim. 19(4), 1574–1609 (2009). https://doi.org/10.1137/070704277
Nesterov, Y.: Lectures on Convex Optimization. Springer, New York (2018). https://doi.org/10.1007/978-3-319-91578-4
Shreve, S.E.: Stochastic Calculus for Finance II: Continuous-Time Models. Springer, New York (2004). https://doi.org/10.1007/978-1-4757-4296-1
Johnson, R., Zhang, T.: Accelerating stochastic gradient descent using predictive variance reduction. Adv. Neural Inf. Process. Syst. 26, 315–323 (2013)
Necoara, I., Nesterov, Y., Glineur, F.: Linear convergence of first order methods for non-strongly convex optimization. Math. Program. 175, 69–107 (2019). https://doi.org/10.1007/s10107-018-1232-1
Park, Y., Ryu, E.K.: Linear convergence of cyclic SAGA. Optim. Lett. 14, 1583–1598 (2020). https://doi.org/10.1007/s11590-019-01520-y
Lee, S.-I., Lee, H., Abbeel, P., Ng, A.: Efficient $L_1$ regularized logistic regression. In: Proc. AAAI, pp. 401–408 (2006)
Yan, Y., Xu, Y.: Adaptive primal-dual stochastic gradient method for expectation-constrained convex stochastic programs. Math. Program. Comput. 14, 319–363 (2022). https://doi.org/10.1007/s12532-021-00214-w
Dua, D., Graff, C.: UCI Machine Learning Repository (2017). http://archive.ics.uci.edu/ml
Le Roux, N., Schmidt, M.W., Bach, F.R.: A stochastic gradient method with an exponential convergence rate for finite training sets. Adv. Neural Inf. Process. Syst. 25, 2663–2671 (2013)
Bolte, J., Pauwels, E.: Majorization-minimization procedures and convergence of SQP methods for semi-algebraic and tame programs. Math. Oper. Res. 41(2), 442–465 (2016). https://doi.org/10.1287/moor.2015.0735
Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103, 127–152 (2005). https://doi.org/10.1007/s10107-004-0552-5
Guyon, I., Gunn, S.R., Ben-Hur, A., Dror, G.: Result analysis of the NIPS 2003 feature selection challenge. Adv. Neural Inf. Process. Syst. 17, 545–552 (2005)
Xu, Y.: Iteration complexity of inexact augmented Lagrangian methods for constrained convex programming. Math. Program. 185, 199–244 (2021). https://doi.org/10.1007/s10107-019-01425-9
Auslender, A., Shefi, R., Teboulle, M.: A moving balls approximation method for a class of smooth constrained minimization problems. SIAM J. Optim. 20(6), 3232–3259 (2010). https://doi.org/10.1137/090763317
Beck, A.: First-Order Methods in Optimization. SIAM, Philadelphia (2017). https://doi.org/10.1137/1.9781611974997
Rigollet, P., Tong, X.: Neyman-Pearson classification, convexity and stochastic constraints. J. Mach. Learn. Res. 12, 2831–2855 (2011)
Robbins, H., Monro, S.: A stochastic approximation method. Ann. Math. Stat. 22, 400–407 (1951). https://doi.org/10.1214/aoms/1177729586
Tong, X., Feng, Y., Zhao, A.: A survey on Neyman-Pearson classification and suggestions for future research. Wiley Interdiscip. Rev. Comput. Stat. 8, 64–81 (2016). https://doi.org/10.1002/wics.1376
Nguyen, L.M., Liu, J., Scheinberg, K., Takáč, M.: SARAH: a novel method for machine learning problems using stochastic recursive gradient. In: Proceedings of the 34th ICML, pp. 2613–2621 (2017)
Grant, M., Boyd, S., Ye, Y.: CVX: Matlab software for disciplined convex programming (2008)
StartPage 1253
SubjectTerms Computational Intelligence
Mathematics
Mathematics and Statistics
Numerical and Computational Physics
Operations Research/Decision Theory
Optimization
Original Paper
Simulation
Title Variance reduced moving balls approximation method for smooth constrained minimization problems
URI https://link.springer.com/article/10.1007/s11590-023-02049-x
Volume 18