Applying Hybrid Deep Neural Network for the Recognition of Sign Language Words Used by the Deaf COVID-19 Patients


Bibliographic Details
Published in: Arabian Journal for Science and Engineering, Vol. 48, No. 2, pp. 1349–1362
Main Authors: Venugopalan, Adithya; Reghunadhan, Rajesh
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.02.2023
ISSN: 2193-567X, 1319-8025, 2191-4281
DOI: 10.1007/s13369-022-06843-0

Summary: The rapid spread of the novel coronavirus disease (COVID-19) has disrupted traditional clinical services all over the world. Hospitals and healthcare centers have taken extreme care to minimize the risk of exposure to the virus by restricting patients' visitors and relatives. The dramatic changes in healthcare norms have made it hard for deaf patients to communicate and receive appropriate care. This paper reports work on automatic sign language recognition aimed at mitigating the communication barrier between deaf patients and healthcare workers in India. Since hand gestures are the most expressive components of a sign language vocabulary, a novel dataset of dynamic hand gestures is proposed for the Indian Sign Language (ISL) words commonly used for emergency communication by deaf COVID-19-positive patients. A hybrid deep convolutional long short-term memory (LSTM) network was used to recognize the proposed hand gestures and achieved an average accuracy of 83.36%. The model's performance was further validated on an alternative ISL dataset as well as a benchmark hand gesture dataset, obtaining average accuracies of 97% and 99.34 ± 0.66%, respectively.
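The hybrid architecture described above, a per-frame convolutional feature extractor followed by an LSTM over the frame sequence, can be sketched as below. This is a minimal illustrative model in PyTorch; the layer sizes, names, and clip dimensions are assumptions for the sketch, not the authors' exact network.

```python
# Hypothetical sketch of a hybrid deep convolutional LSTM classifier for
# dynamic hand-gesture video clips. All layer widths are illustrative.
import torch
import torch.nn as nn

class ConvLSTMClassifier(nn.Module):
    def __init__(self, num_classes: int, hidden: int = 128):
        super().__init__()
        # Small per-frame CNN: maps each RGB frame to a fixed-length feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (batch*time, 32)
        )
        # LSTM models the temporal dynamics of the gesture across frames
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        feats = self.cnn(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.lstm(feats)   # final hidden state summarizes the clip
        return self.head(h_n[-1])        # (batch, num_classes) class scores

model = ConvLSTMClassifier(num_classes=10)
logits = model(torch.randn(2, 8, 3, 64, 64))  # 2 clips of 8 RGB 64x64 frames
print(tuple(logits.shape))  # (2, 10)
```

The key design point is that the CNN weights are shared across all frames (the time axis is folded into the batch axis before the convolutions), while the LSTM alone carries information between frames.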