Shrink boost for selecting multi-LBP histogram features in object detection


Bibliographic Details
Published in: 2012 IEEE Conference on Computer Vision and Pattern Recognition, pp. 3250-3257
Main Authors: Cher Keng Heng, Yokomitsu, S., Matsumoto, Y., Tamura, H.
Format: Conference Proceeding
Language: English, Japanese
Published: IEEE, 01.06.2012
ISBN: 9781467312264, 1467312266
ISSN: 1063-6919
DOI: 10.1109/CVPR.2012.6248061


Summary: Feature selection from sparse, high-dimensional features using conventional greedy boosting yields classifiers with poor generalization. We propose a novel "shrink boost" method to address this problem. It solves a sparse regularization problem in two iterative steps. First, a "boosting" step uses weighted training samples to learn a full high-dimensional classifier over all features; this avoids overfitting to a few features and improves generalization. Next, a "shrinkage" step shrinks the least discriminative classifier dimensions to zero, removing redundant features. In our object detection system, we use shrink boost to select sparse features from histograms of local binary patterns (LBP) over multiple quantizations and image channels, and learn a classifier of additive lookup tables (LUTs). Our evaluation shows that this classifier generalizes much better than classifiers from greedy boosting and from SVM methods, even with a limited number of training samples. On public human detection and pedestrian detection datasets we outperform the state of the art, and on our more challenging bird detection dataset we show promising results.
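The two alternating steps described in the summary can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: it stands in a plain linear classifier for the paper's additive LUT classifier over LBP histograms, uses AdaBoost-style exponential-loss sample weights for the "boosting" step, and implements the "shrinkage" step as hard-thresholding the least discriminative dimensions to zero. All function and variable names are illustrative assumptions.

```python
import numpy as np

def shrink_boost(X, y, n_keep, n_rounds=20, lr=0.5):
    """Sketch of the two-step "shrink boost" loop (illustrative, not the
    paper's implementation). X: (n_samples, n_features), y in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)  # full linear classifier; stands in for the additive LUTs
    for _ in range(n_rounds):
        # --- "boosting" step: reweight samples, then update EVERY dimension
        # at once (no greedy single-feature selection).
        margins = y * (X @ w)
        sample_w = np.exp(-margins)            # exponential-loss weights
        sample_w /= sample_w.sum()
        grad = X.T @ (sample_w * y)            # weighted label correlation per feature
        w += lr * grad
        # --- "shrinkage" step: zero out the least discriminative dimensions,
        # keeping only the n_keep largest-magnitude weights (sparse selection).
        if np.count_nonzero(w) > n_keep:
            cutoff = np.sort(np.abs(w))[-n_keep]
            w[np.abs(w) < cutoff] = 0.0
    return w

# Toy usage: 2 informative features hidden among 50 noise features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 52))
y = np.sign(X[:, 0] + 0.8 * X[:, 1] + 0.1 * rng.normal(size=200))
w = shrink_boost(X, y, n_keep=5)
acc = np.mean(np.sign(X @ w) == y)
```

Because every round re-fits all dimensions before shrinking, a feature dropped early can re-enter later if it becomes discriminative under the updated sample weights, which is the key difference from greedy boosting that selects one feature per round and never revisits it.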