Idle-state detection in motor imagery of articulation using early information: A functional Near-infrared spectroscopy study

Bibliographic Details
Published in: Biomedical Signal Processing and Control, Vol. 72, p. 103369
Main Authors: Guo, Zengzhi; Chen, Fei
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.02.2022
ISSN: 1746-8094, 1746-8108
DOI: 10.1016/j.bspc.2021.103369

Summary:
• A new two-class asynchronous functional near-infrared spectroscopy (fNIRS)-based brain-computer interface (BCI) that robustly detects the idle state within a reduced time window in the context of motor imagery of articulation (MIA) is described.
• A reduced time window (0–2.5 s) improves the information transfer rate of an fNIRS-based BCI.
• Time-domain, spatial-domain, and connectivity features are analyzed and compared for idle-state detection in an MIA context.
• Complementary information derived from activation, spatial, and information-propagation patterns can improve idle-state detection during the performance of MIA.

A speech imagery-based brain-computer interface (BCI) provides an alternative, intuitive way for people to interact with the outside world. Most speech-imagery BCIs based on functional near-infrared spectroscopy (fNIRS) are poorly suited to applications outside the laboratory because they cannot detect asynchronous (self-paced) actions. This work aimed to develop a two-class asynchronous BCI that detects the idle state in the context of motor imagery of articulation (MIA). In this study, 19 healthy subjects were asked to rehearse the Chinese vowels /a/ and /u/ covertly and to accommodate a rest state. A feature selection strategy was designed to combine time-domain, spatial-domain, and functional connectivity features. All discriminative information was extracted from fNIRS signals in a 0–2.5 s time window. Among single-modality features, the centrality features of brain networks outperformed all other features, yielding a subject-dependent classification accuracy of 75.1%. The subject-dependent classification accuracy of the combined features, 78.9%, was significantly better than that of any single feature. These results demonstrate that it is feasible to distinguish idle states from active MIA states within a reduced time window.
The proposed combination of information-propagation, spatial, and activation patterns was effective for extracting discriminative information, thereby improving classification accuracy. The high classification accuracy and fast information transfer rate of the presented BCI show promising practicality for real-world applications.
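The abstract's link between a shorter time window and a faster information transfer rate can be illustrated with the standard Wolpaw ITR formula, a common BCI metric. This is a sketch only: the record does not state which ITR definition the authors used, and the 10 s comparison window below is a hypothetical value for contrast, not from the paper.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, trial_seconds: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    bits/trial = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    scaled by the number of trials per minute.
    """
    p = accuracy
    bits = math.log2(n_classes)
    if 0 < p < 1:  # guard: the formula is undefined at p = 0 or p = 1
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_classes - 1))
    return bits * 60.0 / trial_seconds

# Reported values from the abstract: 2 classes, 78.9% accuracy, 0-2.5 s window.
itr_short = wolpaw_itr(2, 0.789, 2.5)
# Hypothetical longer 10 s window at the same accuracy, for comparison.
itr_long = wolpaw_itr(2, 0.789, 10.0)
print(f"2.5 s window: {itr_short:.2f} bits/min; 10 s window: {itr_long:.2f} bits/min")
```

At a fixed accuracy, halving or quartering the trial length scales the ITR up proportionally, which is why the 0–2.5 s window matters for practicality.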