SUREHYP: An Open Source Python Package for Preprocessing Hyperion Radiance Data and Retrieving Surface Reflectance

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 22, No. 23, p. 9205
Main Authors: Miraglio, Thomas; Coops, Nicholas C.
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 26.11.2022
ISSN: 1424-8220
DOI: 10.3390/s22239205

Summary: Surface reflectance is an essential product from remote sensing Earth observations, critical for a wide variety of applications, including consistent land cover and change mapping and the estimation of vegetation attributes. From 2000 to 2017, the Earth Observing-1 Hyperion instrument acquired the first satellite-based hyperspectral image archive from space, resulting in over 83,138 publicly available images. Hyperion imagery, however, requires significant preprocessing to derive surface reflectance. SUREHYP is a Python package designed to process batches of Hyperion images, bringing together a number of published algorithms and methods to correct at-sensor radiance and derive surface reflectance. In this paper, we present the SUREHYP workflow and demonstrate its application on Hyperion imagery. Results indicate that, over flat terrain, SUREHYP produces surface reflectance comparable to that of commercially available software, with reflectance values across the whole spectral range almost entirely within 10% of the commercial software's over a reference target, while being publicly available and open source, allowing the exploitation of this valuable hyperspectral archive on a global scale.
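
As a rough illustration of the batch workflow described in the summary, the sketch below loops over a directory of Hyperion scenes, first preprocessing the at-sensor radiance and then retrieving surface reflectance. The function names (preprocess_radiance, retrieve_reflectance), the directory layout, and the EO1H* scene-naming pattern are illustrative assumptions for this sketch, not the actual SUREHYP API.

```python
# Hypothetical sketch of a batch Hyperion workflow (not the actual SUREHYP API).
from pathlib import Path


def preprocess_radiance(scene_dir: Path) -> Path:
    """Placeholder for radiance preprocessing (e.g., destriping, smile correction, georeferencing)."""
    # A real implementation would read the L1R radiance cube and write a corrected radiance file.
    return scene_dir / "radiance_corrected.tif"


def retrieve_reflectance(radiance_path: Path, out_dir: Path) -> Path:
    """Placeholder for atmospheric correction of calibrated radiance to surface reflectance."""
    out_dir.mkdir(parents=True, exist_ok=True)
    return out_dir / (radiance_path.stem + "_reflectance.tif")


def batch_process(scenes_root: Path, out_dir: Path) -> None:
    """Loop over every Hyperion scene folder and derive surface reflectance."""
    for scene_dir in sorted(scenes_root.glob("EO1H*")):  # assumed scene-naming pattern
        radiance = preprocess_radiance(scene_dir)
        reflectance = retrieve_reflectance(radiance, out_dir)
        print(f"{scene_dir.name}: wrote {reflectance}")


if __name__ == "__main__":
    batch_process(Path("hyperion_l1r"), Path("surface_reflectance"))
```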