Bayesian Fusion of Unlabeled Vision and RF Data for Aerial Tracking of Ground Targets


Bibliographic Details
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1629-1636
Main Authors: Rajasekaran, Ramya Kanlapuli; Ahmed, Nisar; Frew, Eric
Format: Conference Proceeding
Language: English
Published: IEEE, 24.10.2020
ISSN: 2153-0866
DOI: 10.1109/IROS45743.2020.9341410


More Information
Summary: This paper presents a method for target localization and tracking in clutter using Bayesian fusion of vision and Radio Frequency (RF) sensors aboard a small Unmanned Aircraft System (sUAS). Sensor fusion ensures tracking robustness and reliability in case of camera occlusion or RF signal interference. Camera data is processed by an off-the-shelf algorithm that detects possible objects of interest in a given image frame, and the true RF-emitting target must be identified from among these detections if it is present. These data sources, together with the unknown motion of the target, lead to heavily non-linear, non-Gaussian target state uncertainty, which is not amenable to typical data association methods for tracking. A probabilistic model is therefore first rigorously developed to relate conditional dependencies between target movements, RF data, and visual object detections. A modified particle filter is then developed to simultaneously reason over target states and RF emitter association hypothesis labels for visual object detections. Truth-model simulations are presented to compare and validate the effectiveness of the RF + visual data fusion filter.
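The joint reasoning over target states and association labels described in the summary can be sketched as a particle filter whose update step weights each particle by marginalizing over the hypothesis of which visual detection (if any) is the RF emitter. This is an illustrative toy only, not the authors' implementation: the random-walk motion model, Gaussian detection likelihood, and all names and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, detections, meas_std=1.0, clutter_lik=1e-3):
    """One illustrative step of a joint state/association particle filter.

    particles:  (N, 2) array of 2-D target position hypotheses
    weights:    (N,) normalized particle weights
    detections: (M, 2) array of visual detections; at most one is the RF target
    Returns resampled particles, uniform weights, and the posterior
    probability of each association hypothesis (detections 0..M-1, then
    "all clutter / target not detected").
    """
    n = len(particles)
    # Assumed random-walk motion model: propagate with Gaussian process noise.
    particles = particles + rng.normal(0.0, 0.5, size=particles.shape)

    # Gaussian likelihood of each detection being the emitter, per particle.
    d2 = ((particles[:, None, :] - detections[None, :, :]) ** 2).sum(-1)
    lik = np.exp(-0.5 * d2 / meas_std**2) / (2.0 * np.pi * meas_std**2)

    # Hypothesis matrix: detection j is the emitter, or none (clutter only).
    hyp = np.hstack([lik, np.full((n, 1), clutter_lik)])       # (N, M+1)

    # Particle weights marginalize over association hypotheses.
    weights = weights * hyp.sum(axis=1)
    weights /= weights.sum()

    # Posterior association probabilities, marginalized over particles.
    assoc = (weights[:, None] * hyp / hyp.sum(axis=1, keepdims=True)).sum(0)

    # Systematic multinomial resampling to combat weight degeneracy.
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n), assoc

# Demo: particle cloud around the origin; detection 0 is near the believed
# target position, detection 1 is clutter far away.
particles = rng.normal(0.0, 5.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
detections = np.array([[0.2, -0.1], [8.0, 8.0]])
particles, weights, assoc = pf_update(particles, weights, detections)
```

In the demo, `assoc[0]` dominates because detection 0 best explains the particle cloud; the final clutter entry keeps the filter consistent when the emitter is occluded or its detection is missing.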