Lagrangian support vector regression via unconstrained convex minimization

Bibliographic Details
Published in Neural Networks, Vol. 51, pp. 67-79
Main Authors Balasundaram, S., Gupta, Deepak, Kapil
Format Journal Article
Language English
Published Kidlington: Elsevier Ltd, 01.03.2014
ISSN 0893-6080
1879-2782
DOI 10.1016/j.neunet.2013.12.003

More Information
Summary: In this paper, a simple reformulation of the Lagrangian dual of the 2-norm support vector regression (SVR) is proposed as an unconstrained minimization problem. This formulation has the advantage that its objective function is strongly convex and has only m variables, where m is the number of input data points. The proposed unconstrained Lagrangian SVR (ULSVR) can be solved by computing the zeros of the gradient of its objective function. However, since the objective function contains the non-smooth ‘plus’ function, two approaches are followed to solve the optimization problem: (i) introduce a smooth approximation and solve the resulting slightly modified unconstrained minimization problem; (ii) solve the problem directly by applying a generalized derivative. Computational results obtained on a number of synthetic and real-world benchmark datasets show generalization performance similar to that of the conventional SVR with much faster learning, and training time very close to that of least squares SVR, clearly indicating the superiority of ULSVR solved by the smooth and generalized derivative approaches.
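
As a rough illustration of the smoothing approach mentioned in the summary, the sketch below replaces the non-smooth ‘plus’ function max(x, 0) with the smooth approximation p(x, a) = x + (1/a) log(1 + exp(-a x)), a standard choice in the smooth-SVM literature, and minimizes an unconstrained strongly convex function of m variables with an off-the-shelf quasi-Newton solver. Whether the paper uses exactly this smoothing is an assumption; the objective, the data A, b, and the parameter C here are placeholders for illustration only, not the actual ULSVR formulation.

    import numpy as np
    from scipy.optimize import minimize

    def smooth_plus(x, alpha=5.0):
        # Smooth approximation of the 'plus' function max(x, 0); it
        # approaches the true plus function as alpha grows.  Written with
        # logaddexp for numerical stability: (1/alpha) * log(1 + exp(alpha*x)).
        return np.logaddexp(alpha * x, 0.0) / alpha

    def objective(u, A, b, C=1.0):
        # Illustrative strongly convex function of m variables containing the
        # (smoothed) plus function, mimicking the structure described in the
        # abstract; it is NOT the ULSVR objective from the paper.
        r = A @ u - b
        return 0.5 * u @ u + C * np.sum(smooth_plus(r) ** 2)

    rng = np.random.default_rng(0)
    m = 50                                  # number of data points / variables
    A = rng.standard_normal((m, m))
    b = rng.standard_normal(m)
    res = minimize(objective, np.zeros(m), args=(A, b), method="L-BFGS-B")
    print("converged:", res.success, "objective value:", res.fun)

Because the smoothed objective is differentiable and strongly convex, any gradient-based method converges to its unique minimizer, which is the property the smoothing step is meant to secure.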
ISSN:0893-6080
1879-2782
DOI:10.1016/j.neunet.2013.12.003