An improved space frequency joint passive azimuth tracking method for underwater targets

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 3007, No. 1, pp. 12062-12066
Main Authors: Shi, Yunjia; Piao, Shengchun; Guo, Junyuan
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.05.2025
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/3007/1/012062

More Information
Summary: In the field of azimuth tracking of underwater targets, spatial and frequency joint track-before-detect methods are widely used. Among them, estimating the frequency and azimuth states of line-spectrum signals in the time series of the frequency-azimuth (FRAZ) spectrum is considered a direct and practical approach. However, existing processing methods generally face the challenge of high computational complexity. To address this problem, this paper proposes an improved method that transforms the FRAZ spectrum into an azimuth / azimuth-variation spectrum with fewer cells. It compensates, to some extent, for the changes in azimuth and frequency caused by target motion by taking the maximum value after multiplying the shifted spectra of different azimuths at adjoining times. Based on this transformation, a hidden Markov model (HMM) with azimuth and azimuth variation as states is established, and the Viterbi algorithm is used to track the azimuth of underwater acoustic targets iteratively. Results on simulated data demonstrate that the proposed method significantly reduces computational complexity and improves tracking stability.
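
The summary describes the tracking stage only at a high level. The Python sketch below illustrates one plausible reading of that stage: a Viterbi search over an HMM whose states are (azimuth, azimuth-variation) cell pairs, with emissions taken from the transformed spectrum. The function name viterbi_azimuth_track, the array shapes, and the transition rule are illustrative assumptions, not the authors' implementation.

import numpy as np

def viterbi_azimuth_track(log_emission, n_az, n_dot, max_jump=1):
    """log_emission: (T, n_az * n_dot) log-scores from the transformed
    azimuth / azimuth-variation spectrum; returns the most likely
    azimuth cell index per frame."""
    T, S = log_emission.shape
    assert S == n_az * n_dot

    # Assumed transition model: azimuth advances by its current variation
    # (in cells) each frame, and the variation itself may change by at
    # most `max_jump` cells between frames.
    log_trans = np.full((S, S), -np.inf)
    for az in range(n_az):
        for dot in range(n_dot):
            s = az * n_dot + dot
            rate = dot - n_dot // 2          # signed azimuth change per frame
            new_az = az + rate
            if not (0 <= new_az < n_az):
                continue
            for d2 in range(max(0, dot - max_jump), min(n_dot, dot + max_jump + 1)):
                log_trans[s, new_az * n_dot + d2] = 0.0   # uniform over allowed moves

    # Forward pass: best cumulative log-score ending in each state.
    delta = log_emission[0].copy()
    backptr = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans            # (prev state, next state)
        backptr[t] = np.argmax(scores, axis=0)
        delta = scores[backptr[t], np.arange(S)] + log_emission[t]

    # Backtrack the best state sequence and report its azimuth component.
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(delta))
    for t in range(T - 1, 0, -1):
        path[t - 1] = backptr[t, path[t]]
    return path // n_dot                               # azimuth cell per frame

# Toy usage with random data: 50 frames, 36 azimuth cells, 5 variation cells.
rng = np.random.default_rng(0)
demo_track = viterbi_azimuth_track(rng.normal(size=(50, 36 * 5)), n_az=36, n_dot=5)
print(demo_track[:10])

Keeping the azimuth variation inside the state is what lets the transition model move the azimuth by its current rate at each frame, which mirrors the motion compensation described in the summary.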