A lightweight convolutional transformer neural network for EEG-based depression recognition
| Published in | Biomedical signal processing and control Vol. 100; p. 107112 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Ltd, 01.02.2025 |
| Subjects | |
| Online Access | Get full text |
| ISSN | 1746-8094 |
| DOI | 10.1016/j.bspc.2024.107112 |
| Summary | • We propose an EEG-based depression recognition model called the Lightweight Convolutional Transformer Neural Network (LCTNN). • We use two methods to reduce computational complexity and improve classification efficiency. • Extensive experiments on two datasets validate the effectiveness of LCTNN. • Compared to other baseline models, LCTNN achieves state-of-the-art performance on most metrics. |
|---|---|
| Abstract | Depression is a serious mental health condition affecting hundreds of millions of people worldwide. The electroencephalogram (EEG) is a spontaneous, rhythmic physiological signal that measures a subject's brain activity and serves as an objective biomarker for depression research. This paper proposes a lightweight Convolutional Transformer neural network (LCTNN) for depression identification. LCTNN has three significant characteristics: (1) It combines the advantages of both CNN and Transformer to learn rich EEG signal representations from local to global perspectives in the time domain. (2) A Channel Modulator (CM) dynamically adjusts the contribution of each electrode channel of the EEG signal to depression identification. (3) Considering that the high temporal resolution of EEG signals imposes a significant burden on computing self-attention, LCTNN replaces canonical self-attention with sparse attention, reducing its spatiotemporal complexity to O(L log L). Furthermore, this paper incorporates an attention pooling operation between two Transformer layers, further reducing the spatial complexity. Compared to other deep learning methods, LCTNN achieves state-of-the-art performance on the majority of metrics across two datasets. This indicates that LCTNN offers new insights into the relationship between EEG signals and depression, providing a valuable reference for the future development of depression diagnosis and treatment. |
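The O(L log L) sparse attention the abstract mentions is typically achieved by letting only a few dominant queries compute full attention while the rest fall back to a cheap default (as in Informer-style ProbSparse attention). The paper's exact formulation is not given here, so the following NumPy sketch is only an illustration of that general idea, not the authors' implementation; for clarity it scores all queries densely, whereas a real O(L log L) variant would estimate query "activity" from a sampled subset of keys.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sparse_attention(Q, K, V, k=None):
    """Toy sparse attention over L time steps.

    Only the top-k most "active" queries (largest max-minus-mean score
    gap) receive full softmax attention; lazy queries output the mean
    of V. Hypothetical illustration; scores are computed densely here
    for clarity, a real implementation samples keys to stay O(L log L).
    """
    L, d = Q.shape
    if k is None:
        k = max(1, int(np.ceil(np.log2(L))))      # keep O(log L) queries
    scores = Q @ K.T / np.sqrt(d)                 # (L, L) score matrix
    sparsity = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(-sparsity)[:k]               # dominant query indices
    out = np.tile(V.mean(axis=0), (L, 1))         # lazy queries -> mean of V
    out[top] = softmax(scores[top]) @ V           # active queries -> full attention
    return out

rng = np.random.default_rng(0)
Q = rng.normal(size=(16, 8))
K = rng.normal(size=(16, 8))
V = rng.normal(size=(16, 8))
print(sparse_attention(Q, K, V).shape)  # (16, 8)
```

The attention pooling the abstract places between the two Transformer layers would then shorten the sequence before the second layer, which is why the spatial complexity drops further.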