Variable Inclusion and Shrinkage Algorithms

Bibliographic Details
Published in: Journal of the American Statistical Association, Vol. 103, No. 483, pp. 1304-1315
Main Authors: Radchenko, Peter; James, Gareth M.
Format: Journal Article
Language: English
Published: Alexandria, VA: Taylor & Francis (American Statistical Association; Taylor & Francis Ltd), 01.09.2008
ISSN: 0162-1459, 1537-274X
DOI: 10.1198/016214508000000481

Summary: The Lasso is a popular and computationally efficient procedure for automatically performing both variable selection and coefficient shrinkage on linear regression models. One limitation of the Lasso is that the same tuning parameter is used for both variable selection and shrinkage. As a result, it typically ends up selecting a model with too many variables to prevent overshrinkage of the regression coefficients. We suggest an improved class of methods called variable inclusion and shrinkage algorithms (VISA). Our approach is capable of selecting sparse models while avoiding overshrinkage problems, and because it uses a path algorithm it is also computationally efficient. We show through extensive simulations that VISA significantly outperforms the Lasso and also provides improvements over more recent procedures such as the Dantzig selector, relaxed Lasso, and adaptive Lasso. In addition, we provide theoretical justification for VISA in terms of nonasymptotic bounds on the estimation error that suggest it should exhibit good performance even for large numbers of predictors. Finally, we extend the VISA methodology, path algorithm, and theoretical bounds to the generalized linear models framework.
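
To make the selection-versus-shrinkage trade-off concrete, the sketch below is a hypothetical illustration rather than the authors' VISA path algorithm: it fits a Lasso with a single tuning parameter (which both selects variables and shrinks their coefficients), then refits the selected variables by ordinary least squares in the style of the relaxed Lasso, removing the shrinkage imposed by the selection penalty. The simulated data, the alpha value, and the use of scikit-learn are assumptions made only for this example.

# Illustrative sketch (not the authors' VISA algorithm): decoupling variable
# selection from coefficient shrinkage by refitting after Lasso selection.
# All data and tuning choices below are hypothetical.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]          # sparse true signal
y = X @ beta + rng.standard_normal(n)

# Stage 1: a single Lasso penalty performs selection and shrinkage together.
lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)

# Stage 2: refit the selected variables by least squares, so the retained
# coefficients are no longer shrunk by the selection penalty.
refit = LinearRegression().fit(X[:, selected], y)

print("selected variables:", selected)
print("Lasso coefficients (shrunk):", lasso.coef_[selected].round(2))
print("refit coefficients (unshrunk):", refit.coef_.round(2))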