On ℓq Optimization and Matrix Completion

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Vol. 60, No. 11, pp. 5714-5724
Main Authors: Marjanovic, Goran; Solo, Victor
Format: Journal Article
Language: English
Published: 01.11.2012
ISSN: 1053-587X; 1941-0476
DOI: 10.1109/TSP.2012.2212015

Summary: Rank minimization problems, which consist of finding a matrix of minimum rank subject to linear constraints, have been proposed in many areas of engineering and science. A specific problem is the matrix completion problem, in which a low rank data matrix can be recovered from incomplete samples of its entries by solving a rank penalized least squares problem. The rank penalty is in fact the ℓ0 "norm" of the matrix singular values. A recent convex relaxation of this penalty is the commonly used ℓ1 norm of the matrix singular values. In this paper, we bridge the gap between these two penalties and propose the ℓq, 0 < q < 1, penalized least squares problem for matrix completion. An iterative algorithm is developed by solving a non-standard optimization problem and a non-trivial convergence result is proved. We illustrate with simulations comparing the reconstruction quality of the three matrix singular value penalty functions: ℓ0, ℓ1, and ℓq, 0 < q < 1.
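
For a concrete picture of the criterion described in the summary, the following is a minimal Python sketch of the ℓq (0 < q < 1) penalized least squares objective for matrix completion: a least squares data-fit term over the observed entries plus lam times the sum of the singular values raised to the power q. The names used here (lq_penalized_objective, lam, mask) are illustrative choices, not from the paper, and the snippet only evaluates the objective; it does not reproduce the authors' iterative algorithm or its convergence analysis.

    import numpy as np

    def lq_penalized_objective(X, M, mask, lam, q):
        # Illustrative objective (not the paper's algorithm):
        #   (1/2) * ||P_Omega(X - M)||_F^2 + lam * sum_i sigma_i(X)^q
        # where P_Omega keeps only the observed entries (mask == True).
        residual = (X - M) * mask                    # misfit on observed entries only
        data_fit = 0.5 * np.sum(residual ** 2)       # least squares data-fit term
        sigmas = np.linalg.svd(X, compute_uv=False)  # singular values of the current estimate
        penalty = lam * np.sum(sigmas ** q)          # lq "norm" of the singular values, 0 < q < 1
        return data_fit + penalty

    # Toy usage: a rank-1 matrix observed on roughly 60% of its entries.
    rng = np.random.default_rng(0)
    M_true = np.outer(rng.standard_normal(20), rng.standard_normal(15))
    mask = rng.random(M_true.shape) < 0.6
    X0 = mask * M_true                               # crude initial estimate: zeros where unobserved
    print(lq_penalized_objective(X0, M_true, mask, lam=1.0, q=0.5))

With q near 1 the singular value penalty behaves like the ℓ1 (nuclear norm) relaxation, and as q decreases toward 0 it approaches the ℓ0 rank penalty; this interpolation is the bridging property the paper studies.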