EmoMatchSpanishDB: study of speech emotion recognition machine learning models in a new Spanish elicited database

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 83, No. 5, pp. 13093-13112
Main Authors: Garcia-Cuesta, Esteban; Barba Salvador, Antonio; Gachet Páez, Diego
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2024 (Springer Nature B.V.)
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-023-15959-w


More Information
Summary: In this paper we present a new speech emotion dataset in Spanish. The database was created using an elicited approach and is composed of fifty non-actors expressing Ekman's six basic emotions of anger, disgust, fear, happiness, sadness, and surprise, plus a neutral tone. This article describes how the database was created, from the recording step through the crowdsourced perception test. Crowdsourcing made it possible to statistically validate the emotion of each collected audio sample and to filter out noisy samples. We thereby obtained two datasets, EmoSpanishDB and EmoMatchSpanishDB. The first includes the recorded audios that reached consensus during the crowdsourcing process. The second selects from EmoSpanishDB only those audios whose perceived emotion also matches the originally elicited one. Finally, we present a baseline comparative study of different state-of-the-art machine learning techniques in terms of accuracy, precision, and recall for both datasets. The results obtained for EmoMatchSpanishDB improve on those obtained for EmoSpanishDB, and we therefore recommend following this methodology for the creation of emotional databases.
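The baseline study reports accuracy, precision, and recall over the seven classes (Ekman's six emotions plus neutral). As a minimal sketch of how such metrics can be computed for a multi-class setup, the following standard-library Python uses macro averaging; the toy labels below are purely illustrative and are not taken from the dataset:

```python
# Hypothetical sketch: accuracy and macro-averaged precision/recall
# for a seven-class emotion labeling task (not the paper's own code).

LABELS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def accuracy(y_true, y_pred):
    # Fraction of samples whose predicted label matches the true label.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_precision_recall(y_true, y_pred, labels):
    # Compute per-class precision and recall, then average over all classes.
    precisions, recalls = [], []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        predicted_c = sum(p == c for p in y_pred)
        actual_c = sum(t == c for t in y_true)
        precisions.append(tp / predicted_c if predicted_c else 0.0)
        recalls.append(tp / actual_c if actual_c else 0.0)
    return sum(precisions) / len(labels), sum(recalls) / len(labels)

# Illustrative toy labels only.
y_true = ["anger", "fear", "neutral", "happiness", "fear", "sadness"]
y_pred = ["anger", "fear", "neutral", "sadness", "fear", "sadness"]

acc = accuracy(y_true, y_pred)
prec, rec = macro_precision_recall(y_true, y_pred, LABELS)
```

Macro averaging weights every class equally, which matters for emotion corpora where some emotions (e.g. disgust) are typically harder to elicit and therefore rarer.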