A novel scheme based on information theory and transfer learning for multi classes motor imagery decoding
| Published in | IET Signal Processing, Vol. 17, No. 5 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | Wiley, 01.05.2023 |
| ISSN | 1751-9675; 1751-9683 |
| DOI | 10.1049/sil2.12222 |
Summary: The main challenges in classifying Motor Imagery tasks from EEG signals are the low signal-to-noise ratio, the non-stationarity, and the high subject dependence of the EEG signal. In this study, a framework for multi-class decoding of Motor Imagery signals is presented, based on information theory and hybrid deep learning together with transfer learning. The OVR-FBDiv method, built on the symmetric Kullback-Leibler (KL) divergence, is used to differentiate and highlight the features of the different classes. The mRMR algorithm then selects the most distinctive of the features obtained from the symmetric KL divergence filters. Finally, a hybrid deep neural network consisting of a CNN and an LSTM learns the spatial and temporal features of the EEG signal, combined with a transfer learning technique to overcome the subject dependence of EEG signals. The average Kappa value achieved by the proposed method for the 4-class Motor Imagery data of BCI Competition IV dataset 2a is 0.84, and the method is compared with other state-of-the-art approaches. The goal of this paper is to provide a framework based on information theory and a hybrid deep learning model that addresses the main challenges of a BCI system with a Motor Imagery paradigm.
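The summary describes a one-vs-rest (OVR) filter-bank divergence step based on the symmetric Kullback-Leibler divergence, followed by mRMR feature selection. The paper's exact OVR-FBDiv procedure is not reproduced here; the sketch below only illustrates the general idea under a Gaussian per-feature assumption, and all names in it (`symmetric_kl_gaussian`, `ovr_divergence_scores`, the top-k ranking used in place of mRMR) are hypothetical.

```python
# Hedged sketch: scoring filter-bank features with a symmetric KL divergence
# in a one-vs-rest (OVR) scheme. The Gaussian approximation per feature and
# all names below are illustrative assumptions, not the paper's OVR-FBDiv code.
import numpy as np

def symmetric_kl_gaussian(mu_p, var_p, mu_q, var_q, eps=1e-12):
    """Symmetric KL divergence between two 1-D Gaussians N(mu_p, var_p), N(mu_q, var_q)."""
    var_p, var_q = var_p + eps, var_q + eps
    kl_pq = 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)
    kl_qp = 0.5 * (np.log(var_p / var_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)
    return kl_pq + kl_qp

def ovr_divergence_scores(features, labels):
    """Score each feature by how well it separates each class from the rest.

    features: (n_trials, n_features) band-power features from a filter bank
    labels:   (n_trials,) integer class labels (e.g. 4 Motor Imagery classes)
    Returns an (n_classes, n_features) array of symmetric KL scores.
    """
    classes = np.unique(labels)
    scores = np.zeros((len(classes), features.shape[1]))
    for ci, c in enumerate(classes):
        one, rest = features[labels == c], features[labels != c]
        for f in range(features.shape[1]):
            scores[ci, f] = symmetric_kl_gaussian(
                one[:, f].mean(), one[:, f].var(),
                rest[:, f].mean(), rest[:, f].var())
    return scores

# Example with synthetic data: rank features per class and keep the top k.
rng = np.random.default_rng(0)
X = rng.normal(size=(288, 36))    # e.g. 288 trials x (9 sub-bands x 4 features)
y = rng.integers(0, 4, size=288)  # 4 Motor Imagery classes
top_k = np.argsort(-ovr_divergence_scores(X, y), axis=1)[:, :8]
```

Keeping the top-k features per class by divergence alone is only a simple stand-in for the mRMR step mentioned in the summary, which additionally penalizes redundancy among the selected features.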