Sign-to-speech translation using machine-learning-assisted stretchable sensor arrays

Bibliographic Details
Published in: Nature Electronics, Vol. 3, No. 9, pp. 571-578
Main Authors: Zhou, Zhihao; Chen, Kyle; Li, Xiaoshi; Zhang, Songlin; Wu, Yufen; Zhou, Yihao; Meng, Keyu; Sun, Chenchen; He, Qiang; Fan, Wenjing; Fan, Endong; Lin, Zhiwei; Tan, Xulong; Deng, Weili; Yang, Jin; Chen, Jun
Format: Journal Article
Language: English
Published: London: Nature Publishing Group, 01.09.2020
ISSN: 2520-1131
DOI: 10.1038/s41928-020-0428-6

More Information
Summary: Signed languages are not as pervasive a conversational medium as spoken languages due to the history of institutional suppression of the former and the linguistic hegemony of the latter. This has led to a communication barrier between signers and non-signers that could be mitigated by technology-mediated approaches. Here, we show that a wearable sign-to-speech translation system, assisted by machine learning, can accurately translate the hand gestures of American Sign Language into speech. The wearable sign-to-speech translation system is composed of yarn-based stretchable sensor arrays and a wireless printed circuit board, and offers a high sensitivity and fast response time, allowing real-time translation of signs into spoken words to be performed. By analysing 660 acquired sign language hand gesture recognition patterns, we demonstrate a recognition rate of up to 98.63% and a recognition time of less than 1 s.

Wearable yarn-based stretchable sensor arrays, combined with machine learning, can be used to translate American Sign Language into speech in real time.
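The record does not specify the recognition model, but the pipeline it describes (multi-channel stretchable-sensor signals classified into signs, which are then voiced) can be illustrated with a minimal sketch. The channel count, window length, number of sign classes, feature choices, and the off-the-shelf support vector classifier below are illustrative assumptions, not details taken from the paper, and the randomly generated arrays merely stand in for real gesture recordings.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Assumed setup (not from the paper): several yarn-sensor channels sampled
# over a fixed time window for each signed gesture.
N_CHANNELS = 11        # hypothetical sensor channel count
WINDOW = 200           # hypothetical samples per gesture window

def extract_features(window):
    """Reduce one (WINDOW, N_CHANNELS) signal window to simple
    per-channel statistics: mean, standard deviation, peak value."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0)])

# Placeholder data standing in for the 660 recorded gesture patterns.
rng = np.random.default_rng(0)
raw = rng.normal(size=(660, WINDOW, N_CHANNELS))
labels = rng.integers(0, 11, size=660)   # e.g. 11 sign classes (assumed)

X = np.stack([extract_features(w) for w in raw])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0, stratify=labels)

# A standard classifier used here for illustration; the paper's model may differ.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In a deployed system, the predicted sign label would then be passed to a text-to-speech stage to produce the spoken output in real time.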