Recognition of human activity using GRU deep learning algorithm


Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 82, No. 30, pp. 47733-47749
Main Author: Mohsen, Saeed
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.12.2023
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-023-15571-y


Summary: Human activity recognition (HAR) is a challenging problem in several fields, such as medical diagnosis. Recent advances in deep learning accuracy have helped address HAR problems, so it is worthwhile to implement deep learning algorithms with high performance and greater accuracy. In this paper, a gated recurrent unit (GRU) algorithm is proposed to classify human activities. The algorithm is applied to the Wireless Sensor Data Mining (WISDM) dataset, gathered from many individuals and covering six activity classes: walking, sitting, downstairs, jogging, standing, and upstairs. The proposed algorithm is trained and tested with hyper-parameter tuning in the TensorFlow framework to achieve high accuracy. Experiments evaluate the performance of the GRU algorithm using receiver operating characteristic (ROC) curves and confusion matrices. The results demonstrate that the GRU algorithm recognizes human activities with high performance: it achieves a testing accuracy of 97.08% and a testing loss of 0.221, while its precision, sensitivity, and F1-score are 97.11%, 97.09%, and 97.10%, respectively. Experimentally, the area under the ROC curves (AUCs) is 100%.
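The record does not include architecture details, but the summary describes a GRU classifier trained on WISDM accelerometer data in TensorFlow. Below is a minimal sketch of such a model; the window length (80 time steps), the tri-axial channel count, the layer sizes, and the optimizer are illustrative assumptions, not the paper's reported configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 6    # walking, sitting, downstairs, jogging, standing, upstairs
WINDOW_LEN = 80    # assumed number of time steps per input window
NUM_CHANNELS = 3   # assumed tri-axial accelerometer input (x, y, z)

def build_gru_model() -> tf.keras.Model:
    """A small stacked-GRU classifier over fixed-length sensor windows."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(WINDOW_LEN, NUM_CHANNELS)),
        layers.GRU(64, return_sequences=True),  # layer sizes are assumptions
        layers.GRU(32),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random stand-in data; real use would segment the WISDM accelerometer
    # streams into labelled windows before training.
    x = np.random.randn(256, WINDOW_LEN, NUM_CHANNELS).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=256)
    model = build_gru_model()
    model.fit(x, y, epochs=2, batch_size=32, validation_split=0.2)
```

The evaluation artifacts the summary mentions (confusion matrices, ROC curves and AUCs) could then be computed from model.predict outputs with standard tools such as scikit-learn's confusion_matrix and roc_curve.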