Dimensionality reduction based on ICA for regression problems

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 71, no. 13, pp. 2596-2603
Main Authors: Kwak, Nojun; Kim, Chunghoon; Kim, Hwangnam
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2008
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2007.11.036

Summary: In manipulating data, as in supervised learning, we often extract new features from the original input variables to reduce the dimensionality of the input space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become applicable to dimensionality reduction for regression problems by maximizing the joint mutual information between the target variable and the new attributes. We applied the proposed method to a couple of real-world regression problems as well as some artificial problems and compared its performance with that of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of the input space without degrading regression performance.
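
The summary couples ICA-based feature extraction with a mutual-information criterion. Below is a minimal sketch of that general idea, assuming scikit-learn's FastICA and mutual_info_regression as stand-ins; the synthetic dataset, component count, and per-component ranking step are illustrative assumptions and do not reproduce the authors' exact joint mutual-information maximization.

    # Sketch: extract independent components, then keep those most
    # informative about the regression target. Not the paper's exact
    # algorithm; a per-component mutual-information ranking is used here
    # as a simple proxy for the joint mutual-information criterion.
    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.feature_selection import mutual_info_regression

    rng = np.random.RandomState(0)

    # Synthetic regression data: 500 samples, 10 input features,
    # with the target depending on only two of them.
    X = rng.randn(500, 10)
    y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.randn(500)

    # Step 1: extract independent components from the inputs.
    ica = FastICA(n_components=10, random_state=0)
    S = ica.fit_transform(X)  # shape (500, 10)

    # Step 2: estimate mutual information between each component
    # and the target, then keep the most informative components.
    mi = mutual_info_regression(S, y, random_state=0)
    top = np.argsort(mi)[::-1][:3]

    X_reduced = S[:, top]  # reduced input space for a downstream regressor
    print("selected components:", top, "MI:", mi[top])

In this sketch the reduced representation X_reduced would then be fed to any standard regressor; the paper's contribution is to fold the target variable into the ICA objective itself rather than ranking components after the fact.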