A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints


Bibliographic Details
Published in: Computational Optimization and Applications, Vol. 57, No. 2, pp. 307–337
Main Authors: Necoara, Ion; Patrascu, Andrei
Format: Journal Article
Language: English
Published: Boston: Springer US (Springer Nature B.V.), 01.03.2014
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-013-9598-8

More Information
Summary: In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has a Lipschitz continuous gradient, we prove that our method obtains an ϵ-optimal solution in iterations, where n is the number of blocks. For the class of problems with cheap coordinate derivatives we show that the new method is faster than methods based on full-gradient information. An analysis of the rate of convergence in probability is also provided. For strongly convex functions our method converges linearly. Extensive numerical tests confirm that, on very large problems, our method is substantially more efficient than methods based on full-gradient information.
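The core mechanism the abstract describes can be illustrated on a toy problem. The sketch below is a hypothetical, simplified illustration of random two-coordinate descent with a single linear coupling constraint (sum(x) = 0), not the paper's exact algorithm: updating along the direction e_i − e_j changes only two coordinates, preserves the constraint automatically, and needs only two coordinate derivatives per iteration — the "cheap coordinate derivatives" setting in which the abstract claims an advantage over full-gradient methods.

```python
import numpy as np

# Illustrative sketch (not the authors' exact method):
# minimize 0.5 * ||x - c||^2  subject to  sum(x) = 0.
# A step along e_i - e_j changes only x[i] and x[j] and leaves
# sum(x) unchanged, so feasibility is preserved for free.

rng = np.random.default_rng(0)
n = 50
c = rng.standard_normal(n)

x = np.zeros(n)  # feasible starting point: sum(x) = 0
for _ in range(20000):
    i, j = rng.choice(n, size=2, replace=False)
    gi, gj = x[i] - c[i], x[j] - c[j]  # the two coordinate derivatives
    step = -(gi - gj) / 2.0            # exact minimizer along e_i - e_j
    x[i] += step
    x[j] -= step

# Closed-form optimum: projection of c onto the hyperplane sum(x) = 0.
x_star = c - c.mean()
```

Each iteration costs O(1) derivative evaluations, whereas a projected-gradient step on the same problem would need all n partial derivatives; this is the trade-off the abstract refers to when comparing against full-gradient methods.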