Nonoverlapping Spectral Ranges’ Hyperspectral Data Fusion Based on Combined Spectral Unmixing
| Published in | Remote Sensing (Basel, Switzerland), Vol. 17, No. 4, p. 666 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | Basel: MDPI AG, 01.02.2025 |
| ISSN | 2072-4292 |
| DOI | 10.3390/rs17040666 |
Summary: Advances in spectral remote sensing imaging technology now allow hyperspectral data in different spectral ranges, such as the visible and near-infrared (VNIR) and short-wave infrared (SWIR), to be acquired simultaneously, and fusing hyperspectral data from these nonoverlapping spectral ranges has become an urgent task. Most existing hyperspectral fusion methods target two datasets with overlapping spectral ranges and require a spectral response function as a precondition, so they do not apply to this task. To address this, we propose the combined spectral unmixing fusion (CSUF) method, an unsupervised method with physical interpretability. It solves the nonoverlapping-range fusion problem by estimating the point spread functions of the two hyperspectral datasets and performing combined spectral unmixing. Experiments on airborne datasets and HJ-2 satellite data show that, compared with several leading methods, ours achieves the best performance on reference metrics such as PSNR and SAM, as well as on the no-reference metric QNR. We further analyze the spectral response relationship and the impact of the spectral band ratio between the fused datasets on fusion quality, providing a reference for future research.
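The abstract names the two building blocks, point spread function estimation and combined spectral unmixing, without giving the formulation. As a rough illustration of the combined-unmixing idea only, the sketch below unmixes one spectral range under a linear mixing model and reuses the shared abundance maps to recover endmembers for the other range. The NMF solver, all function and variable names, and the assumption of co-registered data on a common pixel grid are illustrative choices, not the authors' CSUF implementation; in particular, the PSF estimation step is omitted here.

```python
# Minimal sketch of linear-mixing-model fusion with shared abundances.
# Illustrative only: not the authors' CSUF method.
import numpy as np

def unmix_nmf(Y, n_end, n_iter=200, eps=1e-9):
    """Factor Y (bands x pixels) into endmembers E (bands x n_end) and
    abundances A (n_end x pixels) via multiplicative NMF updates."""
    rng = np.random.default_rng(0)
    E = rng.random((Y.shape[0], n_end))
    A = rng.random((n_end, Y.shape[1]))
    for _ in range(n_iter):
        A *= (E.T @ Y) / (E.T @ E @ A + eps)
        E *= (Y @ A.T) / (E @ A @ A.T + eps)
    return E, A

def csuf_sketch(Y_vnir, Y_swir, n_end=5):
    """Y_vnir, Y_swir: flattened (bands x pixels) cubes of the same scene,
    assumed co-registered on one pixel grid for simplicity."""
    # Unmix one range; its abundance maps are shared across both ranges.
    E_vnir, A = unmix_nmf(Y_vnir, n_end)
    # Recover the other range's endmembers by least squares against A.
    E_swir = np.clip(Y_swir @ np.linalg.pinv(A), 0, None)
    # Fused product: both spectral ranges reconstructed on one abundance map.
    return np.vstack([E_vnir, E_swir]) @ A
```

The design point the sketch tries to convey is that no spectral response function linking the two ranges is needed: the ranges are coupled only through the shared abundances, which is what makes a nonoverlapping-range fusion tractable.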
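For the reference metrics named in the abstract, the standard definitions of PSNR and SAM can be computed as follows. The (bands x pixels) array shape is an assumption for illustration; QNR is omitted because it is built from separate spectral and spatial distortion indices not described in the abstract.

```python
# Standard-definition PSNR and SAM for (bands x pixels) arrays.
import numpy as np

def psnr(ref, est):
    """Peak signal-to-noise ratio in dB against the reference's peak value."""
    mse = np.mean((ref - est) ** 2)
    return 10.0 * np.log10(ref.max() ** 2 / mse)

def sam(ref, est, eps=1e-12):
    """Mean spectral angle in degrees between per-pixel spectra."""
    dot = np.sum(ref * est, axis=0)
    norms = np.linalg.norm(ref, axis=0) * np.linalg.norm(est, axis=0)
    angles = np.arccos(np.clip(dot / (norms + eps), -1.0, 1.0))
    return np.degrees(angles.mean())
```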