Fusion of multi-scale DEMs from Chang’E-3 descent and Navcam images using compressed sensing method


Bibliographic Details
Published in Planetary Remote Sensing and Mapping, pp. 241 - 250
Main Authors Peng, M., Wan, W., Liu, Z., Di, K.
Format Book Chapter
Language English
Published CRC Press 2019
Edition 1
ISBN 9781138584150
1138584150
DOI 10.1201/9780429505997-16


More Information
Summary: The multi-source digital elevation models (DEMs) generated using images acquired during Chang’e-3’s descent and landing phases and after landing contain supplementary information that allows a higher-quality DEM to be produced by fusing multi-scale DEMs. The proposed fusion method consists of three steps. First, source DEMs are split into small DEM patches, which are classified into a few groups by local density peak clustering. Next, the grouped DEM patches are used for sub-dictionary learning by stochastic coordinate coding. The trained sub-dictionaries are combined to form a dictionary for sparse representation. Finally, the simultaneous orthogonal matching pursuit algorithm is used to achieve sparse representation. We use real DEMs generated from Chang’e-3 descent images and navigation camera stereo images to validate the proposed method. Through our experiments, we reconstruct a seamless DEM with the highest resolution and the broadest spatial coverage of all the input data. The experimental results demonstrate the feasibility of the proposed method.
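The final step of the summarized pipeline, sparse coding with simultaneous orthogonal matching pursuit (SOMP), can be sketched in NumPy. This is an illustrative implementation under generic assumptions (column-normalized dictionary `D`, co-registered DEM patches flattened into the columns of `Y`, a fixed sparsity level `k`), not the authors' code; the dictionary here would correspond to the combined sub-dictionaries learned in the second step.

```python
import numpy as np

def somp(D, Y, k):
    """Simultaneous Orthogonal Matching Pursuit (illustrative sketch).

    Jointly sparse-codes the columns of Y (e.g. vectorized multi-source
    DEM patches) over dictionary D, forcing all signals to share the
    same support of k atoms.
    """
    support = []
    R = Y.copy()
    X_s = np.zeros((0, Y.shape[1]))
    for _ in range(k):
        # Score each atom by its total correlation with the residuals.
        scores = np.sum(np.abs(D.T @ R), axis=1)
        if support:
            scores[support] = -np.inf  # do not reselect chosen atoms
        support.append(int(np.argmax(scores)))
        # Least-squares fit of Y on the selected atoms, then update residual.
        X_s, *_ = np.linalg.lstsq(D[:, support], Y, rcond=None)
        R = Y - D[:, support] @ X_s
    X = np.zeros((D.shape[1], Y.shape[1]))
    X[support] = X_s
    return X, support
```

A fused patch would then be reconstructed as `D @ X` (or a weighted combination of the jointly coded sources); the joint support is what lets patches of different resolutions share one representation.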