Making up the shortages of the Bayes classifier by the maximum mutual information classifier

Bibliographic Details
Published in: Journal of Engineering (Stevenage, England), Vol. 2020, No. 13, pp. 659-663
Main Authors: Lu, Chenguang; Zou, Xiaohui; Wang, Wenfeng; Chen, Xiaofeng
Format: Journal Article
Language: English
Publisher: The Institution of Engineering and Technology, 01.07.2020
ISSN: 2051-3305
DOI: 10.1049/joe.2019.1157

Summary: The Bayes classifier is often used because it is simple, and the maximum posterior probability (MPP) criterion it uses is equivalent to the least error rate criterion. However, it has issues in the following circumstances: (i) if information rather than correctness is more important, we should use the maximum likelihood criterion or maximum information criterion, which can reduce the rate of failure to report small-probability events. (ii) For unseen instance classifications, the previously optimised classifier cannot be properly used when the probability distribution of true classes has changed. (iii) When classes' feature distributions, rather than transition probability functions (TPFs), are stable, it is improper to train a parametric TPF such as the logistic function. (iv) For multi-label classifications, it is difficult to optimise the group of parametric TPFs that the Bayes classifier needs. This study addresses these issues by comparing the MPP criterion with the maximum likelihood criterion and the maximum mutual information (MMI) criterion. It suggests using the MMI criterion for most unseen instance classifications. It presents a new iterative algorithm, the channel matching (CM) algorithm, for MMI classification, and uses two examples to show that the CM algorithm is fast and reliable.
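The summary describes an iterative algorithm that maximises the mutual information between the true class and the classifier's decision. The paper's exact CM algorithm is not reproduced in this record, so the sketch below is only a generic illustration of the idea, under assumptions: a discrete feature space with known priors `p_x` and true-class posteriors `p_y_given_x`, and an alternation between (a) re-estimating class-conditional statistics from the current decision rule and (b) reassigning each feature value to the decision that maximises an expected log-likelihood ratio. All names and the toy distributions are hypothetical.

```python
import numpy as np

def mutual_information(joint):
    """I(A;B) in bits for a joint distribution P(a,b) given as a 2-D array."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])))

def mmi_classify(p_x, p_y_given_x, n_decisions, n_iter=20):
    """Partition discrete feature values into decisions so as to (locally)
    maximise I(true class Y; decision Z). Illustrative only -- not the
    authors' published CM algorithm.

    p_x         : (n_x,)      prior over feature values
    p_y_given_x : (n_x, n_y)  true-class posterior for each feature value
    Returns the decision rule (one decision per feature value) and the
    achieved mutual information in bits."""
    n_x, n_y = p_y_given_x.shape
    p_y = p_x @ p_y_given_x                    # marginal over true classes
    decision = np.arange(n_x) % n_decisions    # arbitrary initial rule
    for _ in range(n_iter):
        # (a) class-conditional statistics P(y | z) under the current rule
        p_y_given_z = np.full((n_decisions, n_y), 1.0 / n_y)
        for z in range(n_decisions):
            sel = decision == z
            mass = p_x[sel].sum()
            if mass > 0:
                p_y_given_z[z] = (p_x[sel] @ p_y_given_x[sel]) / mass
        # (b) reassign each x to the decision z with the largest expected
        #     log-likelihood ratio: sum_y P(y|x) * log2[P(y|z) / P(y)]
        log_ratio = np.log2(np.maximum(p_y_given_z, 1e-12) / p_y)
        new_decision = (p_y_given_x @ log_ratio.T).argmax(axis=1)
        if np.array_equal(new_decision, decision):
            break                              # converged
        decision = new_decision
    # joint distribution P(y, z) induced by the final decision rule
    joint = np.zeros((n_y, n_decisions))
    for x in range(n_x):
        joint[:, decision[x]] += p_x[x] * p_y_given_x[x]
    return decision, mutual_information(joint)

# Toy example: four feature values, two true classes. The first two feature
# values mostly indicate class 0, the last two mostly class 1.
p_x = np.array([0.3, 0.2, 0.25, 0.25])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.8, 0.2],
                        [0.2, 0.8],
                        [0.1, 0.9]])
dec, mi = mmi_classify(p_x, p_y_given_x, n_decisions=2)
```

On this toy input the iteration groups the first two feature values under one decision and the last two under the other, which matches the abstract's point that the MMI rule is driven by information rather than by per-instance error alone.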