A novel technique for identifying attentional selection in a dichotic environment
| Published in | Annual IEEE India Conference, pp. 1 - 5 |
|---|---|
| Main Authors | , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.12.2016 |
| ISSN | 2325-9418 |
| DOI | 10.1109/INDICON.2016.7838885 |
| Summary: | Healthy humans have an innate ability to concentrate on a voice of their choice even in noisy surroundings, yet a complete understanding of how the brain segregates and selects a particular sound remains elusive. Recent studies have successfully demonstrated reconstruction of stimulus speech envelopes through mathematical modeling. To determine the attentional focus of a listener in multi-speaker settings, existing models rely on the correlation between the reconstructed speech signals and the electroencephalogram (EEG) signals acquired while listening to the actual speech. However, realizing these types of models requires substantial time to reconstruct the stimulus and classify the direction of attention. The present study proposes a novel solution to the "cocktail party problem" using a machine-learning approach. In this work, classification features, viz. standard deviation, mean absolute value, mean absolute deviation, and root-mean-square value, were extracted from EEG data. The extracted features were fed into an artificial neural network (ANN) model with a randomized sub-sampling procedure. The final outcomes showed ceiling-level performance in predicting the attentional focus within subjects. These findings attest to the robustness of the developed model for auditory stream segregation. |
|---|---|
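The abstract names four window-level EEG features: standard deviation, mean absolute value, mean absolute deviation, and root-mean-square. A minimal sketch of how such features might be computed for one channel window, assuming NumPy and a 1-D sample array; the paper's exact windowing, channel handling, and preprocessing are not specified in this record, so the function name and interface here are hypothetical:

```python
import numpy as np

def eeg_window_features(window):
    """Compute the four features named in the abstract for one EEG
    window (hypothetical illustration, not the authors' exact code).

    window : 1-D sequence of EEG samples for a single channel/epoch.
    Returns an array [std, MAV, MAD, RMS].
    """
    w = np.asarray(window, dtype=float)
    std = w.std()                         # standard deviation (population)
    mav = np.abs(w).mean()                # mean absolute value
    mad = np.abs(w - w.mean()).mean()     # mean absolute deviation
    rms = np.sqrt((w ** 2).mean())        # root-mean-square value
    return np.array([std, mav, mad, rms])
```

Stacking such per-window feature vectors across channels and epochs would yield the feature matrix that, per the abstract, is fed to an ANN under a randomized sub-sampling procedure.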