Multi-class AdaBoost with Hypothesis Margin

Bibliographic Details
Published in: 2010 20th International Conference on Pattern Recognition, pp. 65-68
Main Authors: Xiaobo Jin, Xinwen Hou, Cheng-Lin Liu
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2010
ISBN: 1424475422; 9781424475421
ISSN: 1051-4651
DOI: 10.1109/ICPR.2010.25

Summary: Most AdaBoost algorithms for multi-class problems decompose the multi-class classification task into multiple binary problems, as in AdaBoost.MH and LogitBoost. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which combines multi-class weak classifiers directly. The hypothesis margin maximizes the output for the positive class while minimizing the maximal output over the negative classes. We discuss upper bounds on the training error of AdaBoost.HM and of a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, using feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracy than AdaBoost.M1 and AdaBoost.MH while being computationally efficient in training.
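
For illustration, the hypothesis margin described in the summary is commonly formalized as h_y(x) - max_{k != y} h_k(x), where h_k(x) is the weak classifier's output for class k and y is the true class. A minimal sketch in Python (the function name, the NumPy usage, and the score-vector interface are our assumptions for illustration, not the paper's implementation):

    import numpy as np

    def hypothesis_margin(scores, y):
        """Hypothesis margin h_y(x) - max_{k != y} h_k(x) for one example.

        scores: 1-D array of per-class outputs h_k(x) from a multi-class
                weak classifier (e.g. a feedforward network).
        y: index of the true (positive) class.
        A positive margin means the true class outscores every negative class.
        """
        rivals = np.delete(scores, y)      # outputs of the negative classes
        return scores[y] - rivals.max()

    # Example: three-class outputs where the true class is 0.
    print(hypothesis_margin(np.array([0.7, 0.2, 0.1]), 0))  # -> 0.5

Per the summary, boosting this margin directly over multi-class weak classifiers, rather than reducing the task to binary subproblems, is what distinguishes AdaBoost.HM from AdaBoost.MH and LogitBoost.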