Multi-class AdaBoost with Hypothesis Margin

Bibliographic Details
Published in 2010 20th International Conference on Pattern Recognition, pp. 65-68
Main Authors Xiaobo Jin, Xinwen Hou, Cheng-Lin Liu
Format Conference Proceeding
Language English
Published IEEE 01.08.2010
ISBN 1424475422
9781424475421
ISSN 1051-4651
DOI 10.1109/ICPR.2010.25

Abstract Most AdaBoost algorithms for multi-class problems decompose the multi-class classification task into multiple binary problems, as in AdaBoost.MH and LogitBoost. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which directly combines multi-class weak classifiers. The hypothesis margin maximizes the output for the positive class while minimizing the maximal output over the negative classes. We discuss the upper bounds on the training error of AdaBoost.HM and of a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, which use feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracy than AdaBoost.M1 and AdaBoost.MH while remaining computationally efficient in training.
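The hypothesis margin mentioned in the abstract can be illustrated with a short sketch. The snippet below assumes the common multi-class formulation, the score of the positive (true) class minus the maximal score over the negative classes; the exact definition used by AdaBoost.HM may differ, and the function name and example scores are purely illustrative.

import numpy as np

def hypothesis_margin(outputs, true_class):
    # outputs[k] is the classifier's score f_k(x) for class k (assumed formulation)
    positive = outputs[true_class]
    negatives = np.delete(outputs, true_class)  # scores of all negative classes
    return positive - negatives.max()           # > 0 means the sample is classified correctly

# Example: three-class scores with class 0 as the true label
scores = np.array([0.7, 0.2, 0.1])
print(hypothesis_margin(scores, 0))  # ~0.5

In a boosting setting, such per-sample margins would typically drive the example reweighting, but that step is not sketched here.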
Author Cheng-Lin Liu
Xiaobo Jin
Xinwen Hou
Author_xml – sequence: 1
  surname: Jin
  fullname: Xiaobo Jin
  email: xbjin@nlpr.ia.ac.cn
  organization: Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
– sequence: 2
  surname: Hou
  fullname: Xinwen Hou
  email: xwhou@nlpr.ia.ac.cn
  organization: Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
– sequence: 3
  surname: Liu
  fullname: Cheng-Lin Liu
  email: liucl@nlpr.ia.ac.cn
  organization: Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
ContentType Conference Proceeding
DOI 10.1109/ICPR.2010.25
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
Discipline Engineering
Computer Science
EISBN 9781424475414
9780769541099
1424475414
0769541097
EndPage 68
ExternalDocumentID 5597629
Genre orig-research
ISBN 1424475422
9781424475421
ISSN 1051-4651
IsPeerReviewed false
IsScholarly true
Language English
PageCount 4
ParticipantIDs ieee_primary_5597629
PublicationCentury 2000
PublicationDate 2010-Aug.
PublicationDateYYYYMMDD 2010-08-01
PublicationDate_xml – month: 08
  year: 2010
  text: 2010-Aug.
PublicationDecade 2010
PublicationTitle 2010 20th International Conference on Pattern Recognition
PublicationTitleAbbrev ICPR
PublicationYear 2010
Publisher IEEE
Publisher_xml – name: IEEE
SourceID ieee
SourceType Publisher
StartPage 65
SubjectTerms Accuracy
Additives
Artificial neural networks
Boosting
Error analysis
Training
Upper bound
Title Multi-class AdaBoost with Hypothesis Margin
URI https://ieeexplore.ieee.org/document/5597629