Efficient sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm

Bibliographic Details
Published in: Computational Statistics & Data Analysis, Vol. 207, p. 108146
Main Authors: McLain, Alexander C.; Zgodic, Anja; Bondell, Howard
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.07.2025
ISSN: 0167-9473
DOI: 10.1016/j.csda.2025.108146

More Information
Summary: Bayesian variable selection methods are powerful techniques for fitting sparse high-dimensional linear regression models. However, many are computationally intensive or require restrictive prior distributions on model parameters. A computationally efficient and powerful Bayesian approach is presented for sparse high-dimensional linear regression, requiring only minimal prior assumptions on parameters through plug-in empirical Bayes estimates of hyperparameters. The method employs a Parameter-Expanded Expectation-Conditional-Maximization (PX-ECM) algorithm to estimate maximum a posteriori (MAP) values of parameters via computationally efficient coordinate-wise optimization. The popular two-group approach to multiple testing motivates the E-step, resulting in a PaRtitiOned empirical Bayes Ecm (PROBE) algorithm for sparse high-dimensional linear regression. Both one-at-a-time and all-at-once optimization can be used to complete PROBE. Extensive simulation studies and analyses of cancer cell drug responses are conducted to compare PROBE's empirical properties with those of related methods. Implementation is available through the R package probe.
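
The summary sketches the core idea: coordinate-wise MAP updates for the regression coefficients, with an E-step that weights each coefficient by a two-group (signal vs. null) posterior inclusion probability. The short R sketch below illustrates that general one-at-a-time scheme on simulated data. It is a simplified illustration only, not the authors' PROBE implementation: the fixed null proportion pi0, the unit prior variance for non-null effects, and the iteration count are placeholder assumptions, and the R package probe should be used for actual analyses.

# Simplified illustration of a coordinate-wise ("one-at-a-time") empirical-Bayes-style
# update for sparse linear regression; not the PROBE algorithm itself.
set.seed(1)
n <- 100; p <- 500
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(rep(2, 5), rep(0, p - 5))
y <- drop(X %*% beta_true + rnorm(n))

beta   <- rep(0, p)    # current coefficient estimates (MAP-style)
gamma  <- rep(0.5, p)  # posterior inclusion probabilities (E-step quantities)
pi0    <- 0.9          # assumed prior proportion of null predictors (placeholder)
sigma2 <- var(y)       # crude initial noise variance

for (iter in 1:20) {
  resid <- y - drop(X %*% (gamma * beta))          # residuals from the gamma-weighted fit
  for (j in 1:p) {
    xj  <- X[, j]
    r_j <- resid + xj * (gamma[j] * beta[j])       # partial residual excluding predictor j
    b_j <- sum(xj * r_j) / sum(xj^2)               # one-at-a-time least-squares update
    se2 <- sigma2 / sum(xj^2)
    # Two-group E-step: marginal likelihood of b_j under a non-null component
    # (assumed prior variance 1) versus the null component.
    lik1 <- dnorm(b_j, 0, sqrt(se2 + 1))
    lik0 <- dnorm(b_j, 0, sqrt(se2))
    gamma[j] <- (1 - pi0) * lik1 / ((1 - pi0) * lik1 + pi0 * lik0)
    beta[j]  <- b_j / (1 + se2)                    # posterior-mean shrinkage under the N(0, 1) prior
    resid <- r_j - xj * (gamma[j] * beta[j])       # refresh residuals with the new values
  }
  sigma2 <- mean((y - drop(X %*% (gamma * beta)))^2)  # M-step-style noise-variance update
}

head(round(gamma * beta, 2), 10)  # the first five (true signal) coefficients should dominate

An all-at-once variant would, roughly speaking, update every coefficient from the same residual vector before recomputing it, rather than refreshing the residuals after each coordinate as above.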