Single-Loop Projection-Free and Projected Gradient-Based Algorithms for Nonconvex-Concave Saddle Point Problems with Bilevel Structure
| Published in | Journal of Scientific Computing, Vol. 103, no. 2, p. 52 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US, 01.05.2025 (Springer Nature B.V.) |
| ISSN | 0885-7474; 1573-7691 |
| DOI | 10.1007/s10915-025-02864-7 | 
| Summary: | In this paper, we explore a broad class of constrained saddle point problems with a bilevel structure, wherein the upper-level objective function is nonconvex-concave and smooth over compact and convex constraint sets, subject to a strongly convex lower-level objective function. This class of problems finds wide applicability in machine learning, encompassing robust multi-task learning, adversarial learning, and robust meta-learning. Our study extends the current literature in two main directions: (i) we consider a more general setting where the upper-level function is not necessarily strongly concave or linear in the maximization variable; (ii) while existing methods for solving saddle point problems with a bilevel structure are projection-based algorithms, we propose a one-sided projection-free method employing a linear minimization oracle. Specifically, by utilizing regularization and nested approximation techniques, we introduce a novel single-loop one-sided projection-free algorithm requiring $\mathcal{O}(\epsilon^{-4})$ iterations to attain an $\epsilon$-stationary solution; moreover, when the upper-level objective function is linear in the maximization component, this result improves to $\mathcal{O}(\epsilon^{-3})$. Subsequently, we develop an efficient single-loop fully projected gradient-based algorithm capable of achieving an $\epsilon$-stationary solution within $\mathcal{O}(\epsilon^{-5})$ iterations; this improves to $\mathcal{O}(\epsilon^{-4})$ when the upper-level objective function is strongly concave in the maximization component. Finally, we test our proposed methods against state-of-the-art algorithms on a robust multi-task regression problem to showcase the superiority of our algorithms. |
|---|---|
| ISSN: | 0885-7474; 1573-7691 |
| DOI: | 10.1007/s10915-025-02864-7 |
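
To make the abstract's algorithmic setup concrete, the sketch below illustrates a generic single-loop, one-sided projection-free update for a min-max problem: the minimization block is updated with a Frank-Wolfe step obtained from a linear minimization oracle (LMO), while the maximization block takes a projected gradient ascent step. This is only a minimal illustration under assumed problem data; the toy objective, the simplex/box constraint sets, the step sizes, and all names are hypothetical and are not taken from the paper, which additionally handles a strongly convex lower-level problem via regularization and nested approximation.

```python
# Illustrative sketch (not the paper's algorithm): one-sided projection-free
# single-loop updates for  min_{x in X} max_{y in Y} f(x, y).
# x is updated via a linear minimization oracle (Frank-Wolfe style), y via
# projected gradient ascent. The toy f below is convex-concave for simplicity,
# unlike the paper's nonconvex-concave setting; all data here is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 5
A = rng.standard_normal((m, n))           # bilinear coupling between x and y
mu = 0.1                                  # quadratic regularization weight on x

def grad_x(x, y):
    # d/dx [ y^T A x + (mu/2)||x||^2 - 0.5||y||^2 ]
    return A.T @ y + mu * x

def grad_y(x, y):
    # d/dy [ y^T A x + (mu/2)||x||^2 - 0.5||y||^2 ]  (concave in y)
    return A @ x - y

def lmo_simplex(g):
    # Linear minimization oracle over the unit simplex X = {x >= 0, sum x = 1}:
    # argmin_{s in X} <g, s> is the vertex at the smallest coordinate of g.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def project_box(y, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box Y = [lo, hi]^m.
    return np.clip(y, lo, hi)

x = np.full(n, 1.0 / n)                   # feasible start in the simplex
y = np.zeros(m)                           # feasible start in the box

for k in range(1, 501):
    gamma = 2.0 / (k + 2)                 # Frank-Wolfe step size on x
    eta = 0.5                             # ascent step size on y
    s = lmo_simplex(grad_x(x, y))         # projection-free direction for x
    x = x + gamma * (s - x)               # convex combination stays feasible
    y = project_box(y + eta * grad_y(x, y))  # one projected ascent step on y

gap = grad_x(x, y) @ (x - lmo_simplex(grad_x(x, y)))
print("Frank-Wolfe gap in x at fixed y:", gap)
```

The appeal of the LMO-based update is that the convex-combination step keeps the x-iterates feasible without ever projecting onto X, which matters when the projection is expensive (e.g., nuclear-norm balls) while a linear minimization over X is cheap; only the y-block requires a projection, hence "one-sided" projection-free.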