Empirical and computer-aided robustness analysis of long-step and accelerated methods in smooth convex optimization


Bibliographic Details
Main Authors Vernimmen, Pierre; Glineur, François
Format Journal Article
Language English
Published 30.06.2025
Subjects Computer Science - Learning; Mathematics - Optimization and Control
Online Access Get full text
DOI 10.48550/arxiv.2506.09730


Abstract This work assesses both empirically and theoretically, using the performance estimation methodology, how robust different first-order optimization methods are when subject to relative inexactness in their gradient computations. Relative inexactness occurs, for example, when compressing the gradient using fewer bits of information, which happens when dealing with large-scale problems on GPUs. Three major families of methods are analyzed: constant step gradient descent, long-step methods, and accelerated methods. The latter two are first shown to be theoretically not robust to inexactness. Then, a semi-heuristic shortening factor is introduced to improve their theoretical guarantees. All methods are subsequently tested on a concrete inexact problem, with two different types of relative inexactness, and it is observed that both accelerated methods are much more robust than expected, and that the shortening factor significantly helps the long-step methods. In the end, all shortened methods appear to be promising, even in this inexact setting.
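The abstract's notion of relative inexactness can be illustrated with a short sketch: a gradient oracle that returns a perturbed gradient g with ||g - ∇f(x)|| ≤ ε·||∇f(x)||, fed into constant-step gradient descent on a simple smooth convex quadratic. This is an illustrative toy model of the setting described, not the paper's code; the function names, the test problem, and the noise model are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def inexact_gradient(grad, x, eps):
    """Return g with ||g - grad(x)|| <= eps * ||grad(x)|| (relative inexactness)."""
    g = grad(x)
    noise = rng.standard_normal(g.shape)
    norm = np.linalg.norm(noise)
    if norm > 0:
        noise *= eps * np.linalg.norm(g) / norm
    return g + noise

# Toy smooth convex problem: f(x) = 0.5 * x^T A x, smoothness constant L = lambda_max(A)
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
L = 100.0

def gradient_descent(x0, eps, n_iters=200, step=1.0 / L):
    """Constant-step gradient descent driven by relatively inexact gradients."""
    x = x0.copy()
    for _ in range(n_iters):
        x -= step * inexact_gradient(grad, x, eps)
    return x

x0 = np.ones(3)
x_exact = gradient_descent(x0, eps=0.0)   # exact oracle
x_noisy = gradient_descent(x0, eps=0.3)   # 30% relative gradient error
```

Because the perturbation scales with the gradient norm, it shrinks as the iterates approach the optimum, which is why constant-step gradient descent remains comparatively robust in this regime; the paper's contribution is analyzing how long-step and accelerated methods behave under the same oracle.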
Author Vernimmen, Pierre
Glineur, François
Author_xml – sequence: 1
  givenname: Pierre
  surname: Vernimmen
  fullname: Vernimmen, Pierre
– sequence: 2
  givenname: François
  surname: Glineur
  fullname: Glineur, François
BackLink https://doi.org/10.48550/arXiv.2506.09730 (View paper in arXiv)
ContentType Journal Article
Copyright http://arxiv.org/licenses/nonexclusive-distrib/1.0
Copyright_xml – notice: http://arxiv.org/licenses/nonexclusive-distrib/1.0
DBID GOX
DOI 10.48550/arxiv.2506.09730
DatabaseName arXiv.org
DatabaseTitleList
Database_xml – sequence: 1
  dbid: GOX
  name: arXiv.org
  url: http://arxiv.org/find
  sourceTypes: Open Access Repository
DeliveryMethod fulltext_linktorsrc
ExternalDocumentID 2506_09730
GroupedDBID GOX
ID FETCH-arxiv_primary_2506_097303
IEDL.DBID GOX
IngestDate Tue Jul 22 20:28:54 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
MergedId FETCHMERGED-arxiv_primary_2506_097303
OpenAccessLink https://arxiv.org/abs/2506.09730
ParticipantIDs arxiv_primary_2506_09730
PublicationCentury 2000
PublicationDate 2025-06-30
PublicationDateYYYYMMDD 2025-06-30
PublicationDate_xml – month: 06
  year: 2025
  text: 2025-06-30
  day: 30
PublicationDecade 2020
PublicationYear 2025
Score 3.8330019
SecondaryResourceType preprint
SourceID arxiv
SourceType Open Access Repository
SubjectTerms Computer Science - Learning
Mathematics - Optimization and Control
Title Empirical and computer-aided robustness analysis of long-step and accelerated methods in smooth convex optimization
URI https://arxiv.org/abs/2506.09730
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider Cornell University