Sign Language Finger Alphabet Recognition from Gabor-PCA Representation of Hand Gestures

Bibliographic Details
Published in: 2007 International Conference on Machine Learning and Cybernetics, Vol. 4, pp. 2218-2223
Main Authors: Amin, M.A.; Hong Yan
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2007
ISBN: 1424409721 / 9781424409723
ISSN: 2160-133X
DOI: 10.1109/ICMLC.2007.4370514

Summary: In recent years a large number of computer-aided applications have been developed to assist people with disabilities, improving communication between the hearing and the hearing-impaired communities. An intelligent signed-alphabet recognizer can act as an aiding agent that translates signs into words (and sentences) and vice versa. Achieving this goal requires several steps, the first challenging one being the recognition of sign-language alphabets from hand gesture images. In this paper, we propose a system that recognizes American Sign Language (ASL) alphabets from hand gestures with an average accuracy of 93.23%. Classification is performed with fuzzy c-means clustering on lower-dimensional data obtained from Principal Component Analysis (PCA) of the Gabor representation of the hand gesture images. Out of the top 20 principal components (PCs), the best combination is determined by finding the best fuzzy clustering for the corresponding PCs of the training data. The best result is obtained from the combination of the fourth through seventh principal components.
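
The abstract describes a three-stage pipeline: Gabor filtering of the gesture images, PCA to reduce the feature dimension, and fuzzy c-means clustering on a selected range of principal components. The sketch below illustrates that pipeline in Python; it is not the authors' implementation. The filter-bank parameters, image size, placeholder data, the 26-letter cluster count, and the plain NumPy fuzzy c-means routine are all assumptions made for illustration.

```python
# Sketch (assumptions throughout): Gabor features -> PCA -> fuzzy c-means
# on the 4th-7th principal components, as outlined in the abstract.
import numpy as np
from skimage.filters import gabor_kernel
from scipy.ndimage import convolve
from sklearn.decomposition import PCA

def gabor_features(image, frequencies=(0.1, 0.2, 0.3), n_orientations=4):
    """Concatenate responses of a small Gabor filter bank into one vector."""
    feats = []
    for freq in frequencies:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kernel = np.real(gabor_kernel(freq, theta=theta))
            feats.append(convolve(image, kernel, mode='wrap').ravel())
    return np.concatenate(feats)

def fuzzy_c_means(X, n_clusters, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain NumPy fuzzy c-means; returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (dist ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Placeholder stand-in for grayscale hand-gesture images (assumed 64x64).
images = np.random.rand(100, 64, 64)
X = np.array([gabor_features(img) for img in images])

# Keep the top 20 PCs, then select the 4th-7th (0-indexed 3:7), matching the
# component range reported in the abstract.
pcs = PCA(n_components=20).fit_transform(X)
selected = pcs[:, 3:7]

# 26 clusters assumed, one per ASL letter; hard labels via maximum membership.
centers, memberships = fuzzy_c_means(selected, n_clusters=26)
labels = memberships.argmax(axis=1)
```

In the paper the chosen PC range is itself a search result: the authors evaluate combinations of the top 20 PCs and keep the one whose fuzzy clusters best separate the training data, which the fixed 3:7 slice above only approximates.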