Learning Data Representation and Emotion Assessment from Physiological Data
| Published in | Proceedings of the ... IEEE International Conference on Acoustics, Speech and Signal Processing (2020), pp. 3452 - 3456 | 
|---|---|
| Main Authors | | 
| Format | Conference Proceeding | 
| Language | English | 
| Published | IEEE, 01.05.2020 | 
| Subjects | |
| ISSN | 2379-190X | 
| DOI | 10.1109/ICASSP40776.2020.9054498 | 
| Summary: | Aiming at a deeper understanding of human emotional states, we explore deep learning techniques for the analysis of physiological data. In this work, raw two-channel pre-frontal electroencephalography and photoplethysmography signals of 25 subjects were collected using EMOTAI's headband while they watched commercials. Taking the raw data as input, convolutional neural networks were used to learn data representations and classify the acquired signals according to the Positive and Negative Affect Schedule. This approach achieved promising results, with average F1-scores of 76.6% for Positive Affect and 83.3% for Negative Affect. Interpretation of the learned data representation was attempted by computing correlation values between features extracted from the raw inputs and the final classification. The features with the most significant correlations were the alpha band power, and the asymmetry and phase synchronization indexes. The extracted features seem to match the ones learned by the neural network, hence endorsing their validity for emotional studies. | 
|---|---|
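Two of the features the summary highlights, alpha band power and the hemispheric asymmetry index, are standard spectral quantities and can be sketched with Welch's method. The sketch below is illustrative only: the sampling rate, alpha band limits (8-13 Hz), and synthetic two-channel signals are assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import welch

def bandpower(x, fs, fmin, fmax):
    """Approximate power of x in [fmin, fmax] Hz from Welch's PSD estimate."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
    mask = (f >= fmin) & (f <= fmax)
    # Rectangle-rule integral of the PSD over the band.
    return np.sum(psd[mask]) * (f[1] - f[0])

def alpha_asymmetry(left, right, fs):
    """Asymmetry index: difference of log alpha power between channels."""
    return np.log(bandpower(right, fs, 8.0, 13.0)) - np.log(bandpower(left, fs, 8.0, 13.0))

# Synthetic two-channel "EEG": a 10 Hz alpha rhythm, stronger on the right.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
left = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
right = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
print(alpha_asymmetry(left, right, fs))  # positive: more alpha power on the right
```

A positive index here simply reflects the larger alpha amplitude injected into the right channel; how such features relate to affect ratings is the correlation analysis described in the summary.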