Complex-Valued Sparse SAR-Image-Based Target Detection and Classification

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 14, No. 17, p. 4366
Main Authors: Song, Chen; Deng, Jiarui; Liu, Zehao; Wang, Bingnan; Wu, Yirong; Bi, Hui
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.09.2022
ISSN: 2072-4292
DOI: 10.3390/rs14174366

Summary: It is known that synthetic aperture radar (SAR) images obtained by typical matched filtering (MF)-based algorithms always suffer from serious noise, sidelobes, and clutter. However, improving image quality increases the complexity of SAR systems, which limits the applications of SAR images. Introducing sparse signal processing techniques into SAR imaging offers a new way to solve this problem. Sparse SAR images obtained by sparse recovery algorithms outperform typical complex-valued SAR images, with lower sidelobes and higher signal-to-noise ratios (SNR). Target detection and target classification, the most widely applied uses of SAR images, rely on high-quality imagery. Therefore, in this paper, a target detection framework based on sparse images recovered by the complex approximate message passing (CAMP) algorithm and a novel classification network using sparse images reconstructed by the new iterative soft thresholding (BiIST) algorithm are proposed. Experimental results show that sparse SAR images perform better in both target detection and target classification than images recovered by MF-based algorithms, which validates the great application potential of sparse images.
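
For context, the sparse recovery step the abstract refers to can be illustrated with a generic complex-valued iterative soft thresholding (ISTA) sketch. This is not the paper's CAMP or BiIST algorithm; it only shows the basic idea of recovering a sparse complex reflectivity from SAR echoes via L1-regularized least squares. The measurement matrix A, regularization weight lam, and all other names below are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of complex-valued iterative soft thresholding for sparse
    # SAR image recovery (illustrative only; not the paper's CAMP/BiIST code).
    import numpy as np

    def complex_soft_threshold(x, tau):
        # Shrink complex magnitudes toward zero while preserving the phase.
        mag = np.abs(x)
        return np.where(mag > tau, (1.0 - tau / np.maximum(mag, 1e-12)) * x, 0.0)

    def ista_sar(y, A, lam=0.1, n_iter=200):
        # y : (m,) complex echo/measurement vector (assumed)
        # A : (m, n) complex observation matrix (assumed)
        mu = 1.0 / (np.linalg.norm(A, 2) ** 2)   # step size from spectral norm
        x = np.zeros(A.shape[1], dtype=complex)
        for _ in range(n_iter):
            # Gradient step on the data-fidelity term ||y - A x||^2
            x = x + mu * (A.conj().T @ (y - A @ x))
            # Complex soft thresholding enforces sparsity (L1 penalty)
            x = complex_soft_threshold(x, mu * lam)
        return x

    # Toy usage with random data (purely illustrative):
    rng = np.random.default_rng(0)
    A = (rng.standard_normal((128, 256)) + 1j * rng.standard_normal((128, 256))) / np.sqrt(128)
    x_true = np.zeros(256, dtype=complex)
    x_true[[10, 50, 200]] = [3 + 1j, -2j, 1.5]
    y = A @ x_true + 0.01 * (rng.standard_normal(128) + 1j * rng.standard_normal(128))
    x_hat = ista_sar(y, A, lam=0.05, n_iter=500)

CAMP and BiIST refine this basic scheme (e.g., with Onsager correction terms or modified update rules) to improve convergence and preserve background statistics, but the complex soft-thresholding step shown above is the common core.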