Monte Carlo Approximation of Bayes Factors via Mixing With Surrogate Distributions

Bibliographic Details
Published in: Journal of the American Statistical Association, Vol. 117, No. 538, pp. 765-780
Main Authors: Dai, Chenguang; Liu, Jun S.
Format: Journal Article
Language: English
Published: Alexandria: Taylor & Francis, 03.04.2022
ISSN: 0162-1459
EISSN: 1537-274X
DOI: 10.1080/01621459.2020.1811100

More Information
Summary: By mixing the target posterior distribution with a surrogate distribution whose normalizing constant is tractable, we propose a method for estimating the marginal likelihood using the Wang-Landau algorithm. We show that faster convergence of the proposed method can be achieved via momentum acceleration. Two implementation strategies are detailed: (i) facilitating global jumps between the posterior and surrogate distributions via the multiple-try Metropolis (MTM); (ii) constructing the surrogate via variational approximation. When a surrogate is difficult to come by, we describe a new jumping mechanism for general reversible jump Markov chain Monte Carlo algorithms, which combines the MTM and a directional sampling algorithm. We illustrate the proposed methods on several statistical models, including the log-Gaussian Cox process, the Bayesian Lasso, logistic regression, and g-prior Bayesian variable selection. Supplementary materials for this article are available online.
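The mixture idea in the summary can be illustrated with a short sketch (a toy example, not the authors' implementation): a chain alternates between an unnormalized target and a tractable surrogate, and a Wang-Landau-style weight update drives the chain to spend equal time in each component; at stationarity the adapted log-weight estimates the target's log normalizing constant. The target, surrogate, and step-size schedule below are all hypothetical choices made for illustration.

```python
import numpy as np

# Toy sketch: estimate log Z for an unnormalized target
# p~(x) = Z * N(x; 0, 1) by mixing it with a normalized surrogate
# q(x) = N(x; 0, 2^2). A Wang-Landau update adapts the log-weight
# theta so the chain occupies each component half the time; the
# balance point is theta = log Z.
rng = np.random.default_rng(0)
Z_TRUE = 5.0  # known here only so the toy example can be checked

def log_p_unnorm(x):
    # unnormalized target: Z times the standard normal density
    return np.log(Z_TRUE) - 0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_q(x):
    # tractable surrogate: N(0, 2^2), fully normalized
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)

theta = 0.0          # running estimate of log Z
x, label = 0.0, 0    # label 0 = target component, 1 = surrogate
for t in range(1, 200_001):
    # random-walk Metropolis move within the current component
    logf = log_p_unnorm if label == 0 else log_q
    x_prop = x + rng.normal(0.0, 1.0)
    if np.log(rng.random()) < logf(x_prop) - logf(x):
        x = x_prop
    # Gibbs update of the component label under the reweighted mixture
    lw0 = log_p_unnorm(x) - theta   # target density divided by exp(theta)
    lw1 = log_q(x)
    p0 = 1.0 / (1.0 + np.exp(lw1 - lw0))
    label = 0 if rng.random() < p0 else 1
    # Wang-Landau step: push occupation of each component toward 1/2
    gamma = 1.0 / max(1, t // 100)  # ad hoc decaying step-size schedule
    theta += gamma * ((label == 0) - 0.5)

print(f"estimated log Z = {theta:.3f}, true log Z = {np.log(Z_TRUE):.3f}")
```

The paper's actual algorithm adds momentum acceleration and MTM-driven global jumps between the two components; this sketch only shows the plain mixture-plus-Wang-Landau mechanism.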
Bibliography: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
ISSN: 0162-1459
EISSN: 1537-274X
DOI: 10.1080/01621459.2020.1811100