Matrix-Based Discriminant Subspace Ensemble for Hyperspectral Image Spatial-Spectral Feature Fusion

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 54, No. 2, pp. 783-794
Main Authors: Renlong Hang, Qingshan Liu, Huihui Song, Yubao Sun
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 February 2016
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2015.2465899


More Information
Summary: Spatial-spectral feature fusion is widely acknowledged as an effective approach to hyperspectral (HS) image classification, and many previous studies have addressed this subject. However, these methods often treat the high-dimensional spatial-spectral data as a 1-D vector before extracting informative features for classification. In this paper, we propose a new HS image classification method. Specifically, a matrix-based spatial-spectral feature representation is designed for each pixel to capture the local spatial context and the spectral information of all bands, which preserves the spatial-spectral correlation well. Matrix-based discriminant analysis is then adopted to learn a discriminative feature subspace for classification. To further improve the performance of the discriminative subspace, a random sampling technique is used to produce a subspace ensemble for the final HS image classification. Experiments on three HS remote sensing data sets acquired by different sensors demonstrate the effectiveness of the proposed method.
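The two core ideas in the summary, representing each pixel as a (window x bands) matrix and learning a discriminant projection on that matrix, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names are hypothetical, and the discriminant step below is a simplified 2D-LDA-style eigenproblem on the band dimension, used here only to show the shape of the computation.

```python
import numpy as np

def pixel_matrix_feature(cube, row, col, win=3):
    """Matrix-based spatial-spectral feature: represent a pixel by the
    (win*win) x n_bands matrix of spectra in its local neighborhood."""
    r = win // 2
    # Reflect-pad so border pixels still get a full spatial window.
    padded = np.pad(cube, ((r, r), (r, r), (0, 0)), mode="reflect")
    patch = padded[row:row + win, col:col + win, :]
    return patch.reshape(win * win, cube.shape[2])

def discriminant_projection(features, labels, k=5):
    """Simplified 2D-LDA-style step (an assumption, not the paper's exact
    solver): build within/between-class scatter over the band dimension
    and keep the top-k generalized eigenvectors."""
    n_bands = features.shape[2]
    mean_all = features.mean(axis=0)                      # (win*win, bands)
    Sw = np.zeros((n_bands, n_bands))
    Sb = np.zeros_like(Sw)
    for c in np.unique(labels):
        Xc = features[labels == c]
        mean_c = Xc.mean(axis=0)
        for X in Xc:                                      # within-class scatter
            d = X - mean_c
            Sw += d.T @ d
        dm = mean_c - mean_all                            # between-class scatter
        Sb += len(Xc) * (dm.T @ dm)
    # Regularize Sw for numerical stability, then solve Sw^{-1} Sb.
    eigvals, eigvecs = np.linalg.eig(
        np.linalg.solve(Sw + 1e-6 * np.eye(n_bands), Sb))
    order = np.argsort(-eigvals.real)
    return eigvecs[:, order[:k]].real                     # (bands, k)
```

The subspace-ensemble step described in the summary would then repeat `discriminant_projection` on random subsets of the training pixels and combine the per-subspace classifications, e.g. by majority vote.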