Ensemble Methods
| Published in | Machine Learning with Spark and Python, pp. 1–2 |
|---|---|
| Format | Book Chapter |
| Language | English |
| Published | United States: John Wiley & Sons, Inc., 2020 |
| Edition | 2nd Edition |
| ISBN | 1119561930; 9781119561934 |
| DOI | 10.1002/9781119562023.ch6 |
| Summary: | This chapter discusses the mechanics of the most popular ensemble methods. Ensemble methods consist of a hierarchy of two algorithms: they train hundreds or thousands of low-level algorithms called base learners, while a higher-level algorithm controls that training so that the resulting models come out somewhat independent of one another, which means combining them reduces the variance of the combination. Several upper-level algorithms are widely used: bagging, boosting, and random forest. The chapter shows how gradient boosting operates and demonstrates how to control its behavior to get the best performance. The packages available for gradient boosting in Python permit you to use random forest base learners with gradient boosting. The chapter explains each of the high-level algorithms, shows a facsimile of the random forest's base learners, and discusses how to measure and regulate overfitting with binary trees. |
|---|---|
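The gradient boosting workflow the summary describes can be sketched in Python with scikit-learn. This is an illustrative assumption, not the chapter's own code: the dataset is synthetic, and the `n_estimators`, `learning_rate`, and `max_depth` settings are the kind of knobs the chapter says regulate overfitting.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data standing in for a real problem.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Shallow binary trees serve as the base learners; the boosting
# algorithm above them controls their training. Fewer, shallower
# trees and a smaller learning rate all reduce overfitting.
model = GradientBoostingRegressor(
    n_estimators=200, learning_rate=0.1, max_depth=3, random_state=0
)
model.fit(X_train, y_train)

mse = mean_squared_error(y_test, model.predict(X_test))
print(f"test MSE: {mse:.1f}")
```

Holding out a test set and tracking its error as trees are added is one way to measure overfitting: the test error stops improving (or worsens) while training error keeps falling.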