Gaussian mixture modelling by exploiting competitive stop EM algorithm

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 2234, No. 1, pp. 012003-012008
Main Authors: Jia, Kexin; Xin, Yuxia; Cheng, Ting
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.04.2022
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/2234/1/012003


More Information
Summary: To improve the robustness of order selection and parameter learning for the Gaussian mixture model (GMM), this paper proposes a competitive stop expectation-maximization (EM) algorithm based on two stop conditions. The first condition is a Lilliefors-test-based multivariate (MV) normality criterion, used to decide whether to split a component into two new components; the EM algorithm stops splitting once every component satisfies MV normality. The second condition uses the minimum description length (MDL) criterion, which competes with the first condition to keep the EM algorithm from over-splitting. Simulation experiments verify the effectiveness of the proposed algorithm.
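The two stop conditions in the summary can be illustrated with a minimal one-dimensional sketch. This is not the paper's implementation: a plain Kolmogorov-Smirnov test against the fitted normal stands in for the Lilliefors MV normality test (Lilliefors additionally corrects the critical values for estimated parameters), the two-component "fit" is done by a hand-picked split rather than EM, and the MDL form `-log L + (k/2) log n` with equal mixing weights is an assumed simplification.

```python
import numpy as np
from scipy import stats

def mdl(log_likelihood, n_params, n_samples):
    # MDL = -log L + (k/2) * log n; smaller is better.
    return -log_likelihood + 0.5 * n_params * np.log(n_samples)

rng = np.random.default_rng(0)
# Synthetic data from two well-separated Gaussians.
x = np.concatenate([rng.normal(-5, 1, 500), rng.normal(5, 1, 500)])

# Condition 1 (stand-in for the Lilliefors MV normality test):
# test the pooled data against a single fitted normal. A rejection
# signals that the component should be split.
mu, sigma = x.mean(), x.std()
_, p = stats.kstest(x, 'norm', args=(mu, sigma))
print(p < 0.05)  # True: one Gaussian does not fit the bimodal data

# Condition 2 (MDL competition): compare a 1-component fit against a
# 2-component fit (illustrative split at zero, equal mixing weights).
ll1 = stats.norm.logpdf(x, mu, sigma).sum()
left, right = x[x < 0], x[x >= 0]
ll2 = (stats.norm.logpdf(left, left.mean(), left.std()).sum()
       + stats.norm.logpdf(right, right.mean(), right.std()).sum()
       + len(x) * np.log(0.5))
# 1 component: 2 parameters (mean, std); 2 components: 5 (2 means,
# 2 stds, 1 free mixing weight).
print(mdl(ll2, 5, len(x)) < mdl(ll1, 2, len(x)))  # True: splitting wins
```

In the competitive-stop scheme, splitting would continue while the normality test rejects, but only as long as the MDL of the split model also improves; the MDL condition is what prevents over-splitting on data like this once each component looks Gaussian.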