End-to-end Convolutional Neural Networks for Sound Event Detection in Urban Environments
| Published in | Proceedings of the XXth Conference of Open Innovations Association FRUCT, pp. 533 - 539 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | FRUCT, 01.04.2019 |
| ISSN | 2305-7254 |
| DOI | 10.23919/FRUCT.2019.8711906 |
| Summary: | We present a novel approach to tackle the problem of sound event detection (SED) in urban environments using end-to-end convolutional neural networks (CNN). It consists of a 1D CNN that extracts the energy on mel-frequency bands from the audio signal based on a simple filter bank, followed by a 2D CNN for the classification task. The main goal of this two-stage architecture is to bring more interpretability to the first layers of the network and to permit their reuse in other problems of the same domain. We present a novel model that computes the mel-spectrogram with a neural network and is simpler than an existing approach while matching its performance. We also implement a recently proposed normalization of the mel-spectrogram energy (per-channel energy normalization, PCEN) as a layer of the neural network. We show how the parameters of this normalization can be learned by the network and why this is useful for SED in urban environments. We study how training modifies the filter bank as well as the PCEN normalization parameters. The resulting system achieves classification results comparable to the state of the art while decreasing the number of parameters involved. |
|---|---|
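The summary describes two components that lend themselves to a short illustration: a 1D CNN acting as a mel filter bank and a PCEN normalization whose parameters are learned jointly with the network. The sketch below covers only the PCEN part, as a minimal PyTorch layer following the standard per-channel energy normalization formulation; it is not the authors' implementation, and the class name, parameter initializations, and the fixed smoothing coefficient `s` are assumptions made for illustration.

```python
# Hypothetical sketch: PCEN (per-channel energy normalization) as a trainable layer.
# PCEN(t, f) = (E(t, f) / (eps + M(t, f))**alpha + delta)**r - delta**r,
# with M(t, f) = (1 - s) * M(t-1, f) + s * E(t, f).
# alpha, delta and r are learned per mel band; s and eps are kept fixed here (an assumption).
import torch
import torch.nn as nn


class PCEN(nn.Module):
    def __init__(self, n_mels, alpha=0.98, delta=2.0, r=0.5, s=0.025, eps=1e-6):
        super().__init__()
        # One learnable parameter per mel band, so each channel adapts its own gain.
        self.alpha = nn.Parameter(torch.full((n_mels,), alpha))
        self.delta = nn.Parameter(torch.full((n_mels,), delta))
        self.r = nn.Parameter(torch.full((n_mels,), r))
        self.s = s
        self.eps = eps

    def forward(self, E):
        # E: non-negative mel-band energies, shape (batch, n_mels, time).
        # First-order IIR smoothing of the energy along the time axis.
        M = [E[..., 0]]
        for t in range(1, E.shape[-1]):
            M.append((1.0 - self.s) * M[-1] + self.s * E[..., t])
        M = torch.stack(M, dim=-1)

        alpha = self.alpha.unsqueeze(-1)  # broadcast over the time axis
        delta = self.delta.unsqueeze(-1)
        r = self.r.unsqueeze(-1)
        return (E / (self.eps + M) ** alpha + delta) ** r - delta ** r


# Usage example on random mel energies (8 clips, 40 bands, 101 frames).
pcen = PCEN(n_mels=40)
out = pcen(torch.rand(8, 40, 101))
print(out.shape)  # torch.Size([8, 40, 101])
```

Because `alpha`, `delta` and `r` are registered as `nn.Parameter`, they receive gradients and are updated together with the filter-bank and classifier weights during training, which is the behaviour the summary refers to when it says the normalization parameters can be learned by the network.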