Supervised nonnegative matrix factorization with Dual-Itakura-Saito and Kullback-Leibler divergences for music transcription

Bibliographic Details
Published in: 2016 24th European Signal Processing Conference (EUSIPCO), pp. 1138 - 1142
Main Authors: Kagami, Hideaki; Yukawa, Masahiro
Format: Conference Proceeding
Language: English
Published: EURASIP, 01.08.2016
Online Access: Get full text
ISSN: 2076-1465
DOI: 10.1109/EUSIPCO.2016.7760426

More Information
Summary: In this paper, we present a convex-analytic approach to supervised nonnegative matrix factorization (SNMF) based on the Dual-Itakura-Saito (Dual-IS) and Kullback-Leibler (KL) divergences for music transcription. The Dual-IS and KL divergences define convex fidelity functions, whereas the IS divergence defines a nonconvex one. The SNMF problem is formulated as minimizing the divergence-based fidelity function penalized by the ℓ1 and row-block ℓ1 norms subject to the nonnegativity constraint. Simulation results show that (i) the use of the Dual-IS and KL divergences yields better performance than the squared Euclidean distance and that (ii) the use of the Dual-IS divergence efficiently prevents false alarms.
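To make the formulation concrete, the sketch below shows a generic supervised NMF step for the KL case: the dictionary W is pre-trained and held fixed, and only the activation matrix H is estimated by minimizing the generalized KL divergence D(V || WH) plus an ℓ1 penalty under nonnegativity. Note this uses standard multiplicative updates as an illustration only; the paper's actual method is a convex-analytic (proximal-type) solver, and the row-block ℓ1 penalty is omitted here for brevity.

```python
import numpy as np

def snmf_kl(V, W, lam=0.0, n_iter=500, eps=1e-12, seed=0):
    """Illustrative supervised NMF with a fixed dictionary W.

    Minimizes D_KL(V || W H) + lam * ||H||_1 over H >= 0 using the
    classical multiplicative update for the generalized KL divergence:
        H <- H * (W^T (V / (W H))) / (W^T 1 + lam)
    This is a generic sketch, NOT the convex-analytic solver of the paper.
    """
    K = W.shape[1]
    T = V.shape[1]
    rng = np.random.default_rng(seed)
    H = rng.random((K, T)) + eps          # nonnegative random init
    denom = W.sum(axis=0)[:, None] + lam  # (K, 1): W^T 1 + lam
    for _ in range(n_iter):
        R = V / (W @ H + eps)             # elementwise ratio V / (WH)
        H *= (W.T @ R) / denom            # multiplicative KL update
    return H

# Toy usage: V admits an exact factorization V = W @ H_true,
# so the activations should reconstruct V closely.
W = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
H_true = np.array([[1.0, 2.0, 0.0], [0.0, 1.0, 3.0]])
V = W @ H_true
H = snmf_kl(V, W)
```

The update keeps H nonnegative automatically (it multiplies a nonnegative iterate by a nonnegative factor), which is one reason multiplicative rules are the usual baseline against which convex-analytic solvers like the paper's are compared.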