Sarcopenia Recognition System Combined with Electromyography and Gait Obtained by the Multiple Sensor Module and Deep Learning Algorithm


Bibliographic Details
Published in: Sensors and Materials, Vol. 34, No. 6, p. 2403
Main Authors: Chen, I-Miao; Yeh, Pin-Yu; Chang, Ting-Chi; Hsieh, Ya-Chu; Chin, Chiun-Li
Format: Journal Article
Language: English
Published: Tokyo: MYU Scientific Publishing Division, 01.01.2022
ISSN: 0914-4935; 2435-0869
DOI: 10.18494/SAM3787

More Information
Summary: At present, many diseases can be predicted from data obtained by wearable sensors. Most of the proposed wearable devices use only inertial sensors to obtain the walking motion signals of a subject. However, since the symptoms of sarcopenia are reflected in changes in human muscles, we propose a sarcopenia recognition system consisting of hardware and software. The hardware is a multiple sensor module (MSM), a wearable device used to collect electromyography and gait (EAG) signals. The software comprises the biomedical and inertial sensors algorithm (Bodi algorithm) and the leg health classification net (LCNet). The Bodi algorithm is used to calculate various gait indicators after LCNet predicts the risk of sarcopenia. LCNet achieves an accuracy of 94.41%, a precision of 91.58%, a specificity of 95.81%, and a sensitivity of 91.58%. In the future, we expect to use the proposed MSM to collect gait data from additional subjects and apply them to the prediction of other diseases to assist physicians in diagnosis.
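
The abstract reports four standard binary-classification figures for LCNet (accuracy, precision, specificity, and sensitivity). As a point of reference only, the short Python sketch below shows how these quantities are conventionally derived from a binary confusion matrix; the counts in the example are hypothetical placeholders, not data from the paper.

def binary_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Standard metrics from binary confusion-matrix counts (positive = at-risk)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)        # positive predictive value
    sensitivity = tp / (tp + fn)      # recall / true positive rate
    specificity = tn / (tn + fp)      # true negative rate
    return {
        "accuracy": accuracy,
        "precision": precision,
        "sensitivity": sensitivity,
        "specificity": specificity,
    }

if __name__ == "__main__":
    # Hypothetical counts chosen only to illustrate the calculation.
    print(binary_metrics(tp=87, tn=229, fp=8, fn=8))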