Cross-subject federated transfer learning with quanvolutional layer for Motor Imagery classification
| Published in | Chinese Automation Congress (Online), pp. 5736 - 5741 |
|---|---|
| Main Authors | , , , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 22.10.2021 |
| ISSN | 2688-0938 |
| DOI | 10.1109/CAC53003.2021.9727351 |
| Summary: | Brain-Computer Interface (BCI) systems play an important role in rehabilitation therapy, smart-home, and intelligent transportation applications. Unlike cross-trial and cross-run tasks, cross-subject transfer learning for the BCI classification task is hindered by the data privacy of the large datasets collected from multiple subjects. In this paper, a simple federated transfer framework, namely the Federated Transfer Network with Quanvolutional Architecture (FTL-QL), is proposed to overcome this problem. A Riemannian spatial encoder-decoder backbone, built from quanvolutional and encoder-decoder layers, performs the quantum, manifold Riemannian coding, and log-Euclidean Riemannian decoding computations that extract discriminative features for cross-subject transfer learning. A federated module then applies the FederatedAveraging method to train the top layer of FTL-QL for each subject. The performance of FTL-QL is benchmarked on EEG Motor Imagery datasets, and several experiments on the BCI classification task show that the proposed FTL-QL achieves superior learning performance for cross-subject transfer learning. |
|---|---|
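The record does not include the paper's implementation of the quanvolutional layer. As a rough illustration of the general quanvolutional idea (Henderson-style random quantum filters over small input patches), the following is a minimal sketch assuming PennyLane, a 2x2 patch angle-encoded into 4 qubits, and a fixed random circuit; all names, shapes, and parameters below are illustrative and not taken from the paper:

```python
# Minimal quanvolutional-filter sketch (illustrative only, not the paper's implementation).
# A 2x2 patch is angle-encoded into 4 qubits, passed through a fixed random circuit,
# and 4 expectation values are read out as feature-map channels.
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

# Fixed random parameters: the quantum filter is applied, not trained.
rand_params = np.random.uniform(0, 2 * np.pi, size=(1, n_qubits))

@qml.qnode(dev)
def quanv_circuit(patch):
    # Angle-encode the 4 values of a 2x2 patch.
    for i in range(n_qubits):
        qml.RY(np.pi * patch[i], wires=i)
    # Random entangling layer acts as the quanvolutional filter.
    qml.RandomLayers(rand_params, wires=list(range(n_qubits)))
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

def quanv_layer(image):
    """Slide the quantum filter over a 2D input with stride 2, producing 4 channels."""
    h, w = image.shape
    out = np.zeros((h // 2, w // 2, n_qubits))
    for r in range(0, h - 1, 2):
        for c in range(0, w - 1, 2):
            patch = [image[r, c], image[r, c + 1],
                     image[r + 1, c], image[r + 1, c + 1]]
            out[r // 2, c // 2] = quanv_circuit(patch)
    return out
```

How this filter is combined with the Riemannian coding and log-Euclidean decoding stages of the FTL-QL backbone is specific to the paper and not reconstructed here.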
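The abstract also states that the top layer of FTL-QL is trained per subject with the FederatedAveraging method, so that raw EEG data never leaves each subject. The details are not given in this record; below is a minimal sketch of a FederatedAveraging round, assuming each subject holds its own copy of the top-layer weights and the server averages them weighted by local sample counts (function and variable names are hypothetical):

```python
# Minimal FederatedAveraging sketch for a shared top layer (illustrative, not the paper's code).
import numpy as np

def federated_average(client_weights, client_sizes):
    """Aggregate per-subject weight arrays, weighted by each subject's sample count."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

def federated_round(global_weights, subjects, local_update, local_epochs=1):
    """One communication round: each subject updates locally, the server averages the results."""
    updated, sizes = [], []
    for X, y in subjects:  # each subject's private EEG features and labels stay local
        w = local_update(global_weights.copy(), X, y, epochs=local_epochs)
        updated.append(w)
        sizes.append(len(y))
    return federated_average(updated, sizes)
```

In this sketch `local_update` stands in for whatever per-subject optimization the top layer uses; only the resulting weights and sample counts are communicated to the server.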