Fall detection algorithm based on pyramid network and feature fusion

Bibliographic Details
Published in: Evolving Systems, Vol. 15, No. 5, pp. 1957-1970
Main Authors: Li, Jiangjiao; Gao, Mengqi; Wang, Peng; Li, Bin
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.10.2024
ISSN: 1868-6478, 1868-6486
DOI: 10.1007/s12530-024-09601-9

More Information
Summary: Accidental falls are the second leading cause of accidental death among the elderly, and early intervention can reduce the resulting harm. However, few studies to date have used Transformer encoder modules for feature extraction in fall detection, and the real-time performance of existing algorithms is limited. We therefore propose a fall detection method that uses a Transformer to extract spatiotemporal features. Specifically, an image reduction module based on a convolutional neural network first shrinks the input image to reduce computation. A pyramid network built on an improved Transformer then extracts spatial features, and a feature fusion module merges the spatial features from different scales. The fused features are fed into a gated recurrent unit to extract temporal features and classify falls versus normal postures. Experimental results show that the proposed approach achieves accuracies of 99.61% and 99.33% on the UR Fall Detection Dataset and the Le2i Fall Detection Dataset, respectively. Compared with state-of-the-art fall detection algorithms, our method attains high accuracy while maintaining high detection speed.
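
The authors' code is not reproduced here; the following is a minimal PyTorch sketch of the pipeline the abstract outlines: a CNN-based image reduction module, Transformer-based pyramid stages for spatial features, concatenation-based multi-scale fusion, and a GRU for temporal modeling and classification. All layer sizes, stage counts, the fusion-by-concatenation choice, and the omission of positional encoding are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn


class FallDetector(nn.Module):
    def __init__(self, num_classes=2, dims=(64, 128), heads=4, gru_hidden=128):
        super().__init__()
        # Image reduction: a strided convolution shrinks the input before attention.
        self.reduce = nn.Sequential(
            nn.Conv2d(3, dims[0], kernel_size=7, stride=4, padding=3),
            nn.BatchNorm2d(dims[0]),
            nn.ReLU(inplace=True),
        )
        # Two pyramid stages: each applies a Transformer encoder to the flattened
        # feature map; a strided convolution halves the resolution between stages.
        self.stage1 = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dims[0], nhead=heads,
                                       dim_feedforward=2 * dims[0],
                                       batch_first=True),
            num_layers=1)
        self.down1 = nn.Conv2d(dims[0], dims[1], kernel_size=3, stride=2, padding=1)
        self.stage2 = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dims[1], nhead=heads,
                                       dim_feedforward=2 * dims[1],
                                       batch_first=True),
            num_layers=1)
        # Temporal model over per-frame fused features, then classification.
        self.gru = nn.GRU(input_size=sum(dims), hidden_size=gru_hidden,
                          batch_first=True)
        self.head = nn.Linear(gru_hidden, num_classes)

    def _spatial_features(self, x):
        # x: (B, 3, H, W) -> fused multi-scale feature vector of size sum(dims)
        x = self.reduce(x)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)          # (B, H*W, C)
        tokens = self.stage1(tokens)
        x = tokens.transpose(1, 2).reshape(b, c, h, w)
        f1 = x.mean(dim=(2, 3))                        # global pooling, scale 1

        x = self.down1(x)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)
        tokens = self.stage2(tokens)
        f2 = tokens.mean(dim=1)                        # global pooling, scale 2

        return torch.cat([f1, f2], dim=1)              # fuse scales by concatenation

    def forward(self, clip):
        # clip: (B, T, 3, H, W) -- a short sequence of frames
        b, t = clip.shape[:2]
        frames = clip.flatten(0, 1)                    # (B*T, 3, H, W)
        feats = self._spatial_features(frames).view(b, t, -1)
        out, _ = self.gru(feats)                       # temporal features per frame
        return self.head(out[:, -1])                   # logits: fall vs. normal


if __name__ == "__main__":
    model = FallDetector()
    clip = torch.randn(2, 8, 3, 128, 128)              # 2 clips of 8 frames each
    print(model(clip).shape)                           # torch.Size([2, 2])

The sketch pools each pyramid stage to a single vector and concatenates them, which is one simple way to fuse features of different scales; the paper's fusion module may differ.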