Classifying the Human Activities of Sensor Data Using Deep Neural Network
Published in: Intelligent Systems and Pattern Recognition, Vol. 1589, pp. 107–118
Main Authors: , ,
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2022
Series: Communications in Computer and Information Science
ISBN: 9783031082764; 3031082761
ISSN: 1865-0929; 1865-0937
DOI: 10.1007/978-3-031-08277-1_9
Summary: Today, sensors represent one of the most important sources of streaming data. These data have several unique characteristics, including fast arrival, huge volume, and, most prominently, concept drift. Machine learning in general, and deep learning in particular, is among the predominant and successful choices for classifying human activities, owing to both the quality of the results and the processing time. Recognizing human activities from sensor data is an effective and vital task in the healthcare field, and it remains attractive to researchers. This paper presents a DNN model that classifies the human activities of the HuGaDB sensor dataset using a multilayer perceptron (MLP) architecture. The model achieved 91.7% accuracy, 92.5% precision, 92.0% recall, and a 92.0% F1-score, in a short processing time. Compared with previous models, it proved its efficiency by outperforming them.
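The summary reports accuracy, macro precision, recall, and F1-score for a multi-class activity classifier. As a minimal sketch (not taken from the chapter), the snippet below shows one common way such macro-averaged metrics are computed from predicted labels; the activity labels, toy data, and helper function are invented for illustration, and the macro-F1 here is derived from the macro precision and recall (one of several F1-averaging conventions):

```python
# Hypothetical illustration of macro-averaged classification metrics
# (accuracy, precision, recall, F1) for a multi-class activity classifier.
from collections import Counter

def macro_metrics(y_true, y_pred):
    """Return (accuracy, macro precision, macro recall, macro F1)."""
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1          # correct prediction for class t
        else:
            fp[p] += 1          # predicted p, but true class was t
            fn[t] += 1          # missed an instance of class t
    precisions, recalls = [], []
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if (tp[c] + fp[c]) else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if (tp[c] + fn[c]) else 0.0
        precisions.append(prec)
        recalls.append(rec)
    precision = sum(precisions) / len(labels)   # macro average over classes
    recall = sum(recalls) / len(labels)
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return accuracy, precision, recall, f1

# Toy activity labels (invented): 0 = walking, 1 = sitting, 2 = standing.
y_true = [0, 0, 1, 1, 2, 2, 0, 1]
y_pred = [0, 0, 1, 2, 2, 2, 0, 1]
acc, prec, rec, f1 = macro_metrics(y_true, y_pred)
```

In practice a library routine such as scikit-learn's `precision_recall_fscore_support` with `average="macro"` serves the same purpose; the hand-rolled version above only makes the arithmetic behind the reported percentages explicit.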
Bibliography: Supported by Babylon University.