Drone-Person Tracking in Uniform Appearance Crowd: A New Dataset

Bibliographic Details
Published in: Scientific Data, Vol. 11, No. 1, pp. 15-21
Main Authors: Alansari, Mohamad; Abdul Hay, Oussama; Alansari, Sara; Javed, Sajid; Shoufan, Abdulhadi; Zweiri, Yahya; Werghi, Naoufel
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 02.01.2024
ISSN: 2052-4463
DOI: 10.1038/s41597-023-02810-y

Summary: Drone-person tracking in uniform appearance crowds poses unique challenges due to the difficulty of distinguishing individuals with similar attire and across multi-scale variations. To address this issue and facilitate the development of effective tracking algorithms, we present a novel dataset named D-PTUAC (Drone-Person Tracking in Uniform Appearance Crowd). The dataset comprises 138 sequences with over 121K frames, each manually annotated with bounding boxes and attributes. During dataset creation, we carefully consider 18 challenging attributes encompassing a wide range of viewpoints and scene complexities. These attributes are annotated to facilitate attribute-specific performance analysis. Extensive experiments are conducted using 44 state-of-the-art (SOTA) trackers, and the performance gap between visual object trackers' results on existing benchmarks and on our proposed dataset demonstrates the need for a dedicated end-to-end aerial visual object tracker that accounts for the inherent properties of the aerial environment.
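
The record does not include the paper's evaluation code, but tracker comparisons on benchmarks of this kind are typically reported with an overlap-based success metric. The following is a minimal sketch of that idea, assuming the common conventions of tracking benchmarks rather than D-PTUAC's exact protocol: boxes in (x, y, w, h) format and a success curve sampled at 21 evenly spaced IoU thresholds.

```python
import numpy as np

def iou(pred, gt):
    """Intersection-over-union of two axis-aligned boxes given as (x, y, w, h)."""
    px, py, pw, ph = pred
    gx, gy, gw, gh = gt
    # Width/height of the overlap rectangle (clamped at zero if disjoint).
    iw = max(0.0, min(px + pw, gx + gw) - max(px, gx))
    ih = max(0.0, min(py + ph, gy + gh) - max(py, gy))
    inter = iw * ih
    union = pw * ph + gw * gh - inter
    return inter / union if union > 0 else 0.0

def success_auc(pred_boxes, gt_boxes, thresholds=np.linspace(0.0, 1.0, 21)):
    """Fraction of frames whose IoU exceeds each threshold; the mean over
    thresholds approximates the area under the success curve."""
    overlaps = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    return float(np.mean([(overlaps > t).mean() for t in thresholds]))

# Toy usage: two frames of predicted vs. ground-truth boxes.
preds = [(10, 10, 50, 80), (12, 11, 48, 78)]
gts   = [(12, 12, 50, 80), (40, 40, 50, 80)]
print(success_auc(preds, gts))
```

Averaging such per-sequence scores over all 138 sequences is one way the reported performance gap between existing benchmarks and this dataset could be quantified.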