Multi-class AdaBoost with Hypothesis Margin
| Published in | 2010 20th International Conference on Pattern Recognition, pp. 65-68 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.08.2010 |
| ISBN | 1424475422; 9781424475421 |
| ISSN | 1051-4651 |
| DOI | 10.1109/ICPR.2010.25 |
| Summary: | Most AdaBoost algorithms for multi-class problems decompose the multi-class classification task into multiple binary problems, as AdaBoost.MH and LogitBoost do. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which directly combines multi-class weak classifiers. The hypothesis margin maximizes the output for the positive class while minimizing the maximal output over the negative classes. We discuss upper bounds on the training error for AdaBoost.HM and for a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, using feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracy than AdaBoost.M1 and AdaBoost.MH while remaining computationally efficient in training. |
|---|---|
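
The hypothesis margin described in the summary can be illustrated with a short sketch. Assuming a classifier that outputs one score per class, the margin of a labeled example is the score of the true (positive) class minus the largest score among the remaining (negative) classes; this is the standard multi-class margin the summary describes, and the function and example values below are illustrative, not taken from the paper.

```python
import numpy as np

def hypothesis_margin(scores, true_class):
    """Illustrative sketch (not code from the paper): hypothesis margin of
    one example, i.e. the output for the positive class minus the maximal
    output over the negative classes. Positive margin means a correct,
    confident prediction; negative margin means a misclassification."""
    scores = np.asarray(scores, dtype=float)
    negatives = np.delete(scores, true_class)  # drop the true class's score
    return scores[true_class] - negatives.max()

# Example with three-class outputs from a weak learner:
print(hypothesis_margin([0.7, 0.2, 0.1], true_class=0))  # 0.5  (correct)
print(hypothesis_margin([0.3, 0.5, 0.2], true_class=0))  # -0.2 (misclassified)
```

Maximizing this quantity over training examples simultaneously pushes the positive-class output up and the largest negative-class output down, which is how, per the summary, AdaBoost.HM can combine multi-class weak classifiers directly instead of reducing the task to binary subproblems.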