Supervised nonnegative matrix factorization with Dual-Itakura-Saito and Kullback-Leibler divergences for music transcription
In this paper, we present a convex-analytic approach to supervised nonnegative matrix factorization (SNMF) based on the Dual-Itakura-Saito (Dual-IS) and Kullback-Leibler (KL) divergences for music transcription. The Dual-IS and KL divergences define convex fidelity functions, whereas the IS divergence defines a nonconvex one. The SNMF problem is formulated as minimizing the divergence-based fidelity function penalized by the ℓ1 and row-block ℓ1 norms subject to the nonnegativity constraint. Simulation results show that (i) the Dual-IS and KL divergences yield better performance than the squared Euclidean distance and (ii) the Dual-IS divergence effectively suppresses false alarms.
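For reference, the formulation the abstract describes can be sketched as follows under the standard definitions of these divergences; the penalty weights λ1, λ2 and the row-wise grouping of the block norm are illustrative assumptions, not taken from the paper.

```latex
% Element-wise divergences between an observed value x > 0 and a model value y > 0.
% The IS divergence is nonconvex in y; swapping its arguments (Dual-IS) and the
% generalized KL divergence are both convex in y, which is what makes the
% fidelity functions below convex in the activations.
\[
d_{\mathrm{IS}}(x \mid y) = \frac{x}{y} - \log\frac{x}{y} - 1, \qquad
d_{\mathrm{DIS}}(x \mid y) = d_{\mathrm{IS}}(y \mid x) = \frac{y}{x} - \log\frac{y}{x} - 1,
\]
\[
d_{\mathrm{KL}}(x \mid y) = x \log\frac{x}{y} - x + y.
\]
% Supervised NMF: with a pretrained dictionary W held fixed, estimate the
% nonnegative activation matrix H for the magnitude spectrogram V by minimizing
% the divergence-based fidelity plus the l1 and row-block l1 penalties
% (here grouped over the rows H_{k,:}, i.e., one block per dictionary atom):
\[
\min_{H \ge 0} \;
\sum_{i,j} d\bigl( V_{ij} \,\big|\, [WH]_{ij} \bigr)
+ \lambda_1 \|H\|_1
+ \lambda_2 \sum_{k} \bigl\| H_{k,:} \bigr\|_2 .
\]
```

The row-block penalty promotes entire rows of H going to zero, i.e., whole dictionary atoms (pitches) being inactive, which is consistent with the abstract's claim about suppressing false alarms.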
| Published in | 2016 24th European Signal Processing Conference (EUSIPCO), pp. 1138 - 1142 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | EURASIP, 01.08.2016 |
| ISSN | 2076-1465 |
| DOI | 10.1109/EUSIPCO.2016.7760426 |