Brain-Controlled Wheelchair Navigation Using Visual Perception in Simulated Indoor Environments
Published in: 2024 4th International Conference on Electrical Engineering (EECon), pp. 101-106
Format: Conference Proceeding
Language: English
Published: IEEE, 12.12.2024
DOI: 10.1109/EECon64470.2024.10841863
Summary: Wheelchair navigation based on Brain-Computer Interfaces (BCIs) has gained significant attention because it is easy to use and supports the mobility of people with severe paralysis. Making such wheelchairs more comfortable and efficient to operate is essential for practical deployment. In this research, a novel system based on Steady-State Visually Evoked Potentials (SSVEP) is introduced to autonomously guide a wheelchair from its initial location to a predetermined destination. Four distinct flickering frequencies serve as visual stimuli representing specific locations within a simulated indoor environment. Electroencephalography (EEG) signals are acquired using a set of six electrodes, and preprocessing applies Common Average Referencing (CAR). To classify the EEG signals, a novel approach combines Fast Fourier Transform (FFT) and Power Spectral Density (PSD) analyses with peak detection and thresholding techniques. Experiments were conducted in real time with five subjects, each completing six trials for every distinct frequency. The algorithm achieved an accuracy exceeding 90% for each frequency without prior training. Navigation based on the classified SSVEP frequencies was successfully executed within a simulation environment using the Robot Operating System (ROS 2), with favorable outcomes. These results suggest the potential for future implementations through the integration of appropriate hardware and sensors.
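The summary outlines a pipeline of CAR preprocessing followed by FFT/PSD analysis with peak detection and thresholding. A minimal sketch of that idea follows; the sampling rate, flicker frequencies, threshold, and decision rule are all assumptions for illustration, since the record does not state the paper's actual parameters or classifier:

```python
import numpy as np

# Hypothetical parameters -- the record does not give the sampling rate,
# flicker frequencies, or threshold, so these values are assumptions.
FS = 250                               # sampling rate in Hz
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # candidate SSVEP flicker frequencies

def car(eeg):
    """Common Average Referencing: subtract the across-channel mean
    from every sample. `eeg` has shape (n_channels, n_samples)."""
    return eeg - eeg.mean(axis=0, keepdims=True)

def classify_ssvep(eeg, fs=FS, threshold=2.0):
    """Return the stimulus frequency whose PSD peak stands out most,
    or None if no peak exceeds `threshold` times the spectral baseline."""
    referenced = car(eeg)
    n = referenced.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # PSD estimate via FFT, averaged over channels
    psd = (np.abs(np.fft.rfft(referenced, axis=1)) ** 2).mean(axis=0)
    baseline = psd.mean()
    # Peak detection: score each candidate frequency against the baseline
    scores = [psd[np.argmin(np.abs(freqs - f))] / baseline for f in STIM_FREQS]
    best = int(np.argmax(scores))
    return STIM_FREQS[best] if scores[best] > threshold else None

# Synthetic check: six channels carrying a 10 Hz component with
# channel-dependent amplitude (so CAR does not cancel it) plus noise.
rng = np.random.default_rng(0)
t = np.arange(4 * FS) / FS
amps = np.linspace(0.5, 1.5, 6)[:, None]
eeg = amps * np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal((6, t.size))
print(classify_ssvep(eeg))  # → 10.0
```

The relative-to-baseline score is one simple way to realize the thresholding the summary mentions; the paper may well use a different normalization or a per-frequency threshold.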