Quasi-Bayes properties of a procedure for sequential learning in mixture models

Bibliographic Details
Published in: Journal of the Royal Statistical Society. Series B, Statistical Methodology, Vol. 82, No. 4, pp. 1087–1114
Main Authors: Fortini, Sandra; Petrone, Sonia
Format: Journal Article
Language: English
Published: Oxford: Wiley / Oxford University Press, 01.09.2020
ISSN: 1369-7412, 1467-9868
DOI: 10.1111/rssb.12385

More Information
Summary: Bayesian methods are often optimal, yet increasing pressure for fast computations, especially with streaming data, brings renewed interest in faster, possibly suboptimal, solutions. The extent to which these algorithms approximate Bayesian solutions is a question of interest, but often unanswered. We propose a methodology to address this question in predictive settings, when the algorithm can be reinterpreted as a probabilistic predictive rule. We specifically develop the proposed methodology for a recursive procedure for on-line learning in non-parametric mixture models, which is often referred to as Newton's algorithm. This algorithm is simple and fast; however, its approximation properties are unclear. By reinterpreting it as a predictive rule, we can show that it underlies a statistical model which is, asymptotically, a Bayesian, exchangeable mixture model. In this sense, the recursive rule provides a quasi-Bayes solution. Although the algorithm offers only a point estimate, our clean statistical formulation enables us to provide the asymptotic posterior distribution and asymptotic credible intervals for the mixing distribution. Moreover, it gives insights for tuning the parameters, as we illustrate in simulation studies, and paves the way to extensions in various directions. Beyond mixture models, our approach can be applied to other predictive algorithms.
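The recursive procedure discussed in the summary (Newton's algorithm) updates an estimate of the mixing density after each observation via a convex combination of the current estimate and its one-step posterior update. A minimal sketch on a discrete grid is shown below; the kernel (a Gaussian location family), the grid, the weight sequence a_n = 1/(n + 1), and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def newtons_recursion(x, theta_grid, f0, kernel, alphas):
    """Recursive (Newton-style) estimate of a mixing density on a fixed grid.

    Update rule (sketch):
        f_n(theta) = (1 - a_n) * f_{n-1}(theta)
                   + a_n * k(x_n | theta) f_{n-1}(theta) / normaliser
    """
    f = f0.copy()
    d_theta = theta_grid[1] - theta_grid[0]   # uniform grid spacing
    for x_n, a_n in zip(x, alphas):
        lik = kernel(x_n, theta_grid)         # k(x_n | theta) on the grid
        post = lik * f
        post /= post.sum() * d_theta          # one-step posterior update
        f = (1.0 - a_n) * f + a_n * post      # convex combination
    return f

# Illustrative example: location mixture of normals, k(x | theta) = N(x; theta, 1)
rng = np.random.default_rng(0)
theta_grid = np.linspace(-6.0, 6.0, 601)
f0 = np.ones_like(theta_grid) / 12.0          # uniform prior density on [-6, 6]
kernel = lambda x, th: np.exp(-0.5 * (x - th) ** 2) / np.sqrt(2.0 * np.pi)

# synthetic data from the two-component mixture 0.5 N(-2, 1) + 0.5 N(2, 1)
n = 500
data = np.where(rng.random(n) < 0.5, rng.normal(-2, 1, n), rng.normal(2, 1, n))
alphas = 1.0 / (np.arange(1, n + 1) + 1.0)    # a_n = 1 / (n + 1)

f_hat = newtons_recursion(data, theta_grid, f0, kernel, alphas)
```

Each step costs one pass over the grid, which is what makes the rule attractive for streaming data; the output `f_hat` is only the point estimate, and the paper's contribution is the asymptotic posterior that quantifies its uncertainty.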