Using Generalization Error Bounds to Train the Set Covering Machine

Bibliographic Details
Published in: Neural Information Processing, Vol. 4984, pp. 258-268
Main Authors: Hussain, Zakria; Shawe-Taylor, John
Format: Book Chapter
Language: English
Published: Springer Berlin Heidelberg, Germany, 2008
Series: Lecture Notes in Computer Science
ISBN: 3540691545; 9783540691549
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-540-69158-7_28

Summary: In this paper we eliminate the need for parameter estimation associated with the set covering machine (SCM) by directly minimizing generalization error bounds. First, we consider a sub-optimal greedy heuristic algorithm termed the bound set covering machine (BSCM). Next, we propose the branch and bound set covering machine (BBSCM) and prove that it finds a classifier producing the smallest generalization error bound. We further propose a heuristic relaxation, BBSCM(τ), which guarantees a solution whose bound is within a factor τ of the optimal, and justify it empirically. Experiments comparing against the support vector machine (SVM) and SCM algorithms demonstrate that the proposed approaches can lead to some or all of the following: 1) faster running times, 2) sparser classifiers, and 3) competitive generalization error, all while avoiding the need for parameter estimation.
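To make the bound-driven training concrete, the Python sketch below illustrates the greedy idea behind BSCM. It is a minimal illustration, not the authors' implementation: features are assumed to be precomputed Boolean predictions (rows of a matrix H), compression_bound is a hypothetical sample-compression-style penalty standing in for the paper's exact bound, and at each step the feature whose inclusion yields the smallest bound value is added, stopping when no feature improves it.

import numpy as np
from math import log, comb

def compression_bound(n, d, k, delta=0.05):
    # Hypothetical sample-compression-style penalty (illustration only,
    # not the paper's exact expression): grows with the number of selected
    # features d and training errors k, shrinks with the sample size n.
    if n - d - k <= 0:
        return float("inf")
    return (log(comb(n, d)) + log(comb(n - d, k)) + log(1.0 / delta)) / (n - d - k)

def conjunction_errors(H, selected, y):
    # A conjunction predicts positive only where every selected feature
    # fires; with no features selected it predicts positive everywhere.
    pred = np.all(H[list(selected)], axis=0)
    return int(np.sum(pred != y))

def bscm_greedy(H, y, delta=0.05):
    # Greedy bound minimization: add the feature giving the smallest bound,
    # stop when no addition improves it. No penalty or early-stopping
    # parameters are needed; the bound itself drives the search.
    n = len(y)
    selected = []
    best = compression_bound(n, 0, conjunction_errors(H, selected, y), delta)
    while True:
        winner = None
        for j in range(H.shape[0]):
            if j in selected:
                continue
            k = conjunction_errors(H, selected + [j], y)
            b = compression_bound(n, len(selected) + 1, k, delta)
            if b < best:
                best, winner = b, j
        if winner is None:
            return selected, best
        selected.append(winner)

# Example on random data (illustrative only):
rng = np.random.default_rng(0)
H = rng.integers(0, 2, size=(50, 200)).astype(bool)   # 50 candidate features
y = rng.integers(0, 2, size=200).astype(bool)         # binary labels
features, bound = bscm_greedy(H, y)
print(features, round(bound, 3))

The branch and bound variant (BBSCM) would explore the same search tree exhaustively, pruning subtrees whose best attainable bound already exceeds the current minimum; the greedy loop above follows only the single most promising branch at each step.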