AdaBoost.RT: a boosting algorithm for regression problems

Bibliographic Details
Published in: 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541), Vol. 2, pp. 1163-1168
Main Authors: Solomatine, D.P.; Shrestha, D.L.
Format: Conference Proceeding
Language: English
Published: Piscataway, NJ: IEEE, 2004
ISBN: 0780383591, 9780780383593
ISSN: 1098-7576
DOI: 10.1109/IJCNN.2004.1380102

More Information
Summary: A boosting algorithm, AdaBoost.RT, is proposed for regression problems. The idea is to filter out examples whose relative estimation error exceeds a pre-set threshold value, and then to follow the AdaBoost procedure. It therefore requires selecting a sub-optimal value of the relative error threshold to demarcate predictions of the weak learner as correct or incorrect. Experimental results using the M5 model tree as a weak learning machine are reported for benchmark data sets and for hydrological modelling, and are compared with other boosting methods, bagging, artificial neural networks, and a single M5 model tree. AdaBoost.RT is shown to perform better on most of the considered data sets.
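The procedure described in the summary (mark examples whose relative error exceeds a threshold as "incorrect", reweight in AdaBoost fashion, and combine the rounds by their error rates) can be sketched as follows. This is a minimal reading of that description, not the paper's implementation: the weak learner here is a weighted linear fit standing in for the M5 model tree, and the function name, the power parameter n, and the default threshold phi are illustrative assumptions.

```python
import numpy as np

def adaboost_rt(x, y, T=10, phi=0.05, n=1):
    """Sketch of the AdaBoost.RT idea from the summary above.

    phi is the relative-error threshold; n is a power applied to the
    weighted error rate. The paper's weak learner is an M5 model tree;
    a weighted linear fit is used here to keep the sketch self-contained.
    """
    m = len(x)
    D = np.full(m, 1.0 / m)                  # example weights, initially uniform
    models, betas = [], []
    for _ in range(T):
        # np.polyfit weights multiply the residuals, so pass sqrt(D)
        # to weight the *squared* errors by D.
        coef = np.polyfit(x, y, deg=1, w=np.sqrt(D))
        pred = np.polyval(coef, x)
        are = np.abs((pred - y) / y)         # absolute relative error (assumes y != 0)
        eps = D[are > phi].sum()             # weighted rate of "incorrect" predictions
        eps = min(max(eps, 1e-10), 1 - 1e-10)
        beta = eps ** n
        models.append(coef)
        betas.append(beta)
        # Down-weight examples predicted within the threshold, then renormalize,
        # so later rounds concentrate on the hard examples.
        D = np.where(are <= phi, D * beta, D)
        D /= D.sum()
    logw = np.log(1.0 / np.array(betas))     # per-round ensemble weights log(1/beta_t)

    def predict(xq):
        preds = np.array([np.polyval(c, xq) for c in models])
        return (logw[:, None] * preds).sum(axis=0) / logw.sum()

    return predict
```

For example, fitting `adaboost_rt(x, 2 * x + 1)` on positive x (so the relative error is well defined) yields a predictor that recovers the underlying linear trend.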