Classifying the Human Activities of Sensor Data Using Deep Neural Network

Bibliographic Details
Published in: Intelligent Systems and Pattern Recognition, Vol. 1589, pp. 107–118
Main Authors: Al-Khamees, Hussein A. A.; Al-A’araji, Nabeel; Al-Shamery, Eman S.
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2022
Series: Communications in Computer and Information Science
ISBN: 9783031082764, 3031082761
ISSN: 1865-0929, 1865-0937
DOI: 10.1007/978-3-031-08277-1_9

More Information
Summary: Today, sensors are among the most important sources of streaming data. This data has a number of unique characteristics, including fast arrival, huge volume, and, most prominently, concept drift. Machine learning in general, and deep learning in particular, is among the predominant and successful approaches for classifying human activities, owing to both its result quality and its processing time. Recognizing human activities from sensor data is an effective and vital task in the healthcare field, and an attractive one for researchers. This paper presents a DNN model that classifies the human activities of the HuGaDB sensor dataset using a multilayer perceptron (MLP) structure. The model achieved 91.7% accuracy, 92.5% precision, 92.0% recall, and 92.0% F1-score in a short running time. The results were compared with previous models, and the proposed model proved its efficiency by outperforming them.
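
As an illustration of the approach the summary describes, the sketch below builds a small MLP classifier in Keras. It is a minimal sketch, not the authors' published configuration: the hidden-layer widths, optimizer, and training settings are assumptions, while the input and output sizes follow the public HuGaDB description (38 sensor channels, 12 activity classes).

    # Minimal MLP sketch for HuGaDB-style activity classification.
    # Hidden-layer widths and training settings are illustrative assumptions,
    # not the configuration published in this chapter.
    from tensorflow import keras
    from tensorflow.keras import layers
    from sklearn.metrics import precision_score, recall_score, f1_score

    NUM_FEATURES = 38    # 36 inertial channels + 2 EMG channels per HuGaDB sample
    NUM_ACTIVITIES = 12  # HuGaDB activity classes (walking, running, sitting, ...)

    model = keras.Sequential([
        keras.Input(shape=(NUM_FEATURES,)),
        layers.Dense(128, activation="relu"),   # hidden widths are assumptions
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_ACTIVITIES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # X_train/X_test: (n, 38) float arrays of sensor readings;
    # y_train/y_test: integer activity labels in 0..11.
    # model.fit(X_train, y_train, epochs=50, batch_size=64, validation_split=0.1)
    # y_pred = model.predict(X_test).argmax(axis=1)
    # print(precision_score(y_test, y_pred, average="macro"),
    #       recall_score(y_test, y_pred, average="macro"),
    #       f1_score(y_test, y_pred, average="macro"))

The commented lines show how the reported precision, recall, and F1-score could be computed with scikit-learn once training data is loaded; macro averaging is one plausible choice for the multiclass metrics, though the chapter does not specify the averaging scheme here.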
Bibliography: Supported by Babylon University.