Eyegaze Control Algorithm Using Machine Learning



Bibliographic Details
Published in: International Conference on Engineering and Emerging Technologies (Online), pp. 1 - 6
Main Authors: Awan, Adeel Ahmed; Ahmed, Irfan
Format: Conference Proceeding
Language: English
Published: IEEE, 27.10.2023
ISSN: 2831-3682
DOI: 10.1109/ICEET60227.2023.10525870


Summary: Every year, many people lose a body part in accidents, leaving them disabled and unable to perform everyday activities on their own. Disabled people typically need assistance to move from one position to another before they can work, and many who possess valuable skills cannot exercise them because of their disability; for some, the resulting depression is a long struggle. Eye-gaze movement used as a remote control offers one way to restore a measure of independence. Eye tracking tells us where a person looks, what they ignore, and how the pupil reacts to different stimuli; the concept is simple, but the processing and interpretation can be diverse and complex. The goal of this project is to reduce the impact of disability so that people can perform their tasks as others do, with eye-gaze movement serving as the means of control. This paper investigates a novel technique for pupil recognition involving image-processing stages such as pre-processing, face detection, and feature extraction. The eye-detection system is implemented in the Python programming language, and the resulting data can be analyzed, visualized, and interpreted. To demonstrate the concept, the software was integrated with a robotic car, which was driven using these eye movements and interactions.
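The abstract does not give the authors' exact pupil-recognition algorithm, but the pipeline it describes (locate a dark pupil region after pre-processing, then map gaze direction to a movement command for the car) can be sketched in minimal form. The following Python snippet is an illustrative assumption, not the paper's implementation: it finds the centroid of the darkest pixels in a grayscale eye patch and converts the pupil's horizontal offset into a hypothetical `left`/`right`/`forward` command.

```python
# Minimal sketch (not the authors' implementation): locate the pupil in a
# grayscale eye image by thresholding its dark pixels, then map the pupil's
# horizontal position to a hypothetical robot-car command.
import numpy as np


def locate_pupil(eye: np.ndarray, dark_thresh: int = 50) -> tuple[int, int]:
    """Return the (row, col) centroid of the darkest region, assumed to be the pupil."""
    mask = eye < dark_thresh              # pupil pixels are nearly black
    if not mask.any():
        raise ValueError("no dark region found")
    rows, cols = np.nonzero(mask)
    return int(rows.mean()), int(cols.mean())


def gaze_to_command(eye: np.ndarray, margin: float = 0.15) -> str:
    """Map the pupil's horizontal position to 'left', 'right', or 'forward'."""
    _, col = locate_pupil(eye)
    width = eye.shape[1]
    if col < width * (0.5 - margin):      # pupil well left of centre
        return "left"
    if col > width * (0.5 + margin):      # pupil well right of centre
        return "right"
    return "forward"


# Synthetic 40x60 eye patch: bright sclera with a dark pupil blob on the right.
eye = np.full((40, 60), 200, dtype=np.uint8)
eye[15:25, 45:55] = 10
print(gaze_to_command(eye))               # → right
```

In a real system the eye patch would come from the face-detection and feature-extraction stages the abstract mentions (e.g. a camera frame cropped to the eye region), and the command string would be sent to the car's motor controller.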