Sum-difference Reduced-dimensional Processing in SAR Image Domain and Statistical Analysis for Ground Moving Target Indication

Bibliographic Details
Published in: Chinese Journal of Aeronautics, Vol. 22, No. 6, pp. 620-626
Main Authors: Yang Zhiwei, Liao Guisheng, Zhang Juan, Zeng Cao
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.12.2009
ISSN: 1000-9361, 2588-9230
DOI: 10.1016/S1000-9361(08)60150-8

Summary: In this article, a new reduced-dimensional adaptive processing algorithm based on joint-pixel sum-difference data for clutter rejection is proposed. The sum-difference data are obtained by orthogonal projection of the joint-pixel data of different synthetic aperture radar (SAR) images generated by a multi-satellite radar system. In the sense of statistical expectation, the sum data capture the information common to the SAR images, while the difference data capture the information that differs between them; clutter cancellation can therefore be achieved by adaptive processing. Moreover, based on the residual image after clutter rejection, a statistical analysis of constant false-alarm rate (CFAR) detection of moving targets is also presented. Simulation results demonstrate the effectiveness and robustness of the proposed algorithm even with heterogeneous clutter and image co-registration errors.
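
Illustrative sketch: the abstract does not give the full formulation, but its core pipeline (project co-registered multi-channel SAR image pixels onto sum and difference channels, adaptively cancel clutter, then run CFAR detection on the residual image) can be sketched in Python as below. This is a minimal two-channel, single-weight illustration under stated assumptions; all function names, the least-squares canceller, and the cell-averaging CFAR are illustrative choices, not the paper's exact joint-pixels reduced-dimensional processor, which adapts over a stacked neighbourhood of pixels per channel.

    import numpy as np

    def sum_difference_channels(img_a, img_b):
        # Orthogonal projection of two co-registered complex SAR images onto
        # sum and difference channels. The sum channel keeps the part common to
        # both images (dominated by stationary clutter); the difference channel
        # keeps what differs between them (where a moving target shows up).
        s = (img_a + img_b) / np.sqrt(2.0)
        d = (img_a - img_b) / np.sqrt(2.0)
        return s, d

    def adaptive_clutter_cancel(s, d, training_mask):
        # Suppress clutter leaking into the difference channel with a single
        # adaptive complex weight estimated from clutter-only training pixels
        # (a least-squares fit of d ~ w * s), then subtract the fitted part.
        s_t, d_t = s[training_mask], d[training_mask]
        w = np.vdot(s_t, d_t) / np.vdot(s_t, s_t)
        return d - w * s

    def ca_cfar(power, guard=2, train=8, pfa=1e-4):
        # Cell-averaging CFAR on the residual power image, applied along rows.
        # alpha is the standard CA-CFAR scaling for exponentially distributed
        # clutter-plus-noise power with 2*train training cells.
        n_train = 2 * train
        alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)
        det = np.zeros_like(power, dtype=bool)
        rows, cols = power.shape
        for i in range(rows):
            for j in range(guard + train, cols - guard - train):
                left = power[i, j - guard - train : j - guard]
                right = power[i, j + guard + 1 : j + guard + 1 + train]
                noise = (left.sum() + right.sum()) / n_train
                det[i, j] = power[i, j] > alpha * noise
        return det

    # Example use with two co-registered complex images img1, img2 and a
    # boolean mask marking clutter-only training pixels (both hypothetical):
    #   s, d = sum_difference_channels(img1, img2)
    #   resid = adaptive_clutter_cancel(s, d, clutter_mask)
    #   detections = ca_cfar(np.abs(resid) ** 2)
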
Bibliography: O212; TN958; 11-1732/V
Keywords: ground moving target indication; space-time adaptive processing; clutter rejection; multi-channel; synthetic aperture radar