Dictionary Learning for Sparse Representation: A Novel Approach


Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 20, No. 12, pp. 1195-1198
Main Authors: Sadeghi, Mostafa; Babaie-Zadeh, Massoud; Jutten, Christian
Format: Journal Article
Language: English
Published: IEEE, 01.12.2013
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2013.2285218


Summary: A dictionary learning problem is a matrix factorization in which the goal is to factorize a training data matrix, Y, as the product of a dictionary, D, and a sparse coefficient matrix, X, as follows: Y ≃ DX. Current dictionary learning algorithms minimize the representation error subject to a constraint on D (usually unit column norms) and sparseness of X. The resulting problem is not convex with respect to the pair (D, X). In this letter, we derive a first-order series expansion formula for the factorization DX. The resulting objective function is jointly convex with respect to D and X. We solve the resulting problem using alternating minimization and apply some of the previously suggested algorithms to our new problem. Simulation results on recovery of a known dictionary and on dictionary learning for natural image patches show that our new problem considerably improves performance at a small additional computational cost.
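The convexified formulation derived in the letter is not reproduced in this record, but the baseline it builds on, alternating minimization between a sparse coding step and a dictionary update, can be sketched as follows. This is a generic illustration (ISTA soft-thresholding for the sparse coding of X, a MOD-style least-squares update for D with column renormalization), not the authors' algorithm; all function names, dimensions, and parameters below are illustrative assumptions.

```python
import numpy as np

def sparse_code(Y, D, lam=0.1, n_iter=50):
    # ISTA: iterative soft-thresholding for min_X ||Y - DX||_F^2 + lam*||X||_1
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    X = np.zeros((D.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        G = X - (D.T @ (D @ X - Y)) / L    # gradient step on the quadratic term
        X = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0.0)  # soft threshold
    return X

def dict_update(Y, X):
    # MOD-style update: least-squares D, then enforce unit column norms
    D = Y @ np.linalg.pinv(X)
    norms = np.linalg.norm(D, axis=0)
    norms[norms == 0] = 1.0                # guard against unused atoms
    return D / norms

# Synthetic data: sparse combinations of a known random dictionary
rng = np.random.default_rng(0)
n, K, N = 16, 32, 200                      # signal dim, atoms, training signals
D_true = rng.standard_normal((n, K))
D_true /= np.linalg.norm(D_true, axis=0)
X_true = rng.standard_normal((K, N)) * (rng.random((K, N)) < 0.1)
Y = D_true @ X_true

# Alternating minimization from a random unit-norm initialization
D = rng.standard_normal((n, K))
D /= np.linalg.norm(D, axis=0)
for _ in range(20):
    X = sparse_code(Y, D)
    D = dict_update(Y, X)

X = sparse_code(Y, D)                      # final coding against the learned D
err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

The relative representation error `err` decreases over the alternations even though, as the summary notes, the joint problem in (D, X) is non-convex; the letter's contribution is precisely to replace this objective with a jointly convex one via a first-order expansion of DX.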