Feature Construction and Dimension Reduction Using Genetic Programming

Bibliographic Details
Published in: AI 2007: Advances in Artificial Intelligence, Vol. 4830, pp. 160–170
Main Authors: Neshatian, Kourosh; Zhang, Mengjie; Johnston, Mark
Format: Book Chapter
Language: English
Published: Springer Berlin / Heidelberg, Germany, 2007
Series: Lecture Notes in Computer Science
ISBN: 9783540769262, 3540769269
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-540-76928-6_18


More Information
Summary:This paper describes a new approach to the use of genetic programming (GP) for feature construction in classification problems. Rather than wrapping a particular classifier for single feature construction as in most of the existing methods, this approach uses GP to construct multiple (high-level) features from the original features. These constructed features are then used by decision trees for classification. As feature construction is independent of classification, the fitness function is designed based on the class dispersion and entropy. This approach is examined and compared with the standard decision tree method, using the original features, and using a combination of the original features and constructed features, on 12 benchmark classification problems. The results show that the new approach outperforms the standard way of using decision trees on these problems in terms of the classification performance, dimension reduction and the learned decision tree size.
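To make the idea in the abstract concrete, below is a minimal, illustrative Python sketch of the core loop the summary describes: a GP-constructed feature maps an instance's original feature vector to a single scalar, and the fitness of that feature is judged by how pure (low-entropy) the class distribution is along the new feature axis, independently of any classifier. The specific constructed feature, the binning scheme, and the exact fitness formula here are assumptions for illustration, not the authors' precise definitions of class dispersion and entropy.

```python
import math
import random
from collections import Counter

# Illustrative stand-in for an evolved GP tree: a constructed (high-level)
# feature is just a function from the original feature vector to one scalar.
def constructed_feature(x):
    # hypothetical combination of original features x[0..3]
    return x[0] * x[2] - math.sqrt(abs(x[1])) + x[3]

def entropy(labels):
    """Shannon entropy of a collection of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def fitness(feature_fn, X, y, n_bins=10):
    """Assumed fitness: project all instances onto the constructed feature,
    split the resulting axis into intervals, and average the class entropy
    of the occupied intervals. Lower average entropy means the feature
    separates the classes better, so the negative is returned."""
    values = [feature_fn(x) for x in X]
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    bins = [[] for _ in range(n_bins)]
    for v, label in zip(values, y):
        idx = min(int((v - lo) / width), n_bins - 1)
        bins[idx].append(label)
    occupied = [b for b in bins if b]
    avg_entropy = sum(entropy(b) for b in occupied) / len(occupied)
    return -avg_entropy

# Toy usage: two classes that the constructed feature separates fairly well.
random.seed(0)
X = [[random.random() for _ in range(4)] for _ in range(100)]
y = [0 if x[0] * x[2] > 0.25 else 1 for x in X]
print("fitness:", round(fitness(constructed_feature, X, y), 3))
```

In the approach the abstract describes, GP would evolve many candidate feature-construction programs scored this way, and the surviving constructed features would then be passed to a decision tree learner; this sketch only shows how a single candidate feature might be evaluated without wrapping a classifier.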