Supervised nonnegative matrix factorization with Dual-Itakura-Saito and Kullback-Leibler divergences for music transcription

Hideaki Kagami, Masahiro Yukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

In this paper, we present a convex-analytic approach to supervised nonnegative matrix factorization (SNMF) based on the Dual-Itakura-Saito (Dual-IS) and Kullback-Leibler (KL) divergences for music transcription. The Dual-IS and KL divergences define convex fidelity functions, whereas the IS divergence defines a nonconvex one. The SNMF problem is formulated as minimizing the divergence-based fidelity function penalized by the ℓ1 and row-block ℓ1 norms subject to the nonnegativity constraint. Simulation results show that (i) the use of the Dual-IS and KL divergences yields better performance than the squared Euclidean distance and that (ii) the use of the Dual-IS divergence efficiently prevents false alarms.
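As a rough illustration of the formulation described in the abstract, the following Python sketch spells out one plausible reading of the penalized objective. It is not the authors' code: the generalized KL divergence and the argument-swapped ("dual") IS divergence are assumed definitions, the row-block ℓ1 norm is interpreted here as the sum of row-wise ℓ2 norms of the activation matrix H, and the weights lam and mu as well as all function names are hypothetical.

```python
import numpy as np

def kl_divergence(V, WH, eps=1e-12):
    """Generalized Kullback-Leibler divergence D_KL(V || WH), summed
    elementwise; convex in WH for a fixed spectrogram V."""
    V_safe = np.maximum(V, eps)
    WH_safe = np.maximum(WH, eps)
    return np.sum(V * np.log(V_safe / WH_safe) - V + WH)

def dual_is_divergence(V, WH, eps=1e-12):
    """Itakura-Saito divergence with its arguments swapped (assumed
    reading of the 'Dual-IS' divergence), summed elementwise; this
    orientation is convex in WH."""
    ratio = np.maximum(WH, eps) / np.maximum(V, eps)
    return np.sum(ratio - np.log(ratio) - 1.0)

def snmf_objective(V, W, H, lam=0.1, mu=0.1, divergence=dual_is_divergence):
    """Penalized SNMF objective: divergence fidelity plus an l1 penalty
    and a row-block l1 penalty (sum of row-wise l2 norms) on H.
    lam and mu are illustrative weights, not values from the paper."""
    WH = W @ H
    fidelity = divergence(V, WH)
    l1_penalty = lam * np.sum(np.abs(H))
    row_block_penalty = mu * np.sum(np.linalg.norm(H, axis=1))
    return fidelity + l1_penalty + row_block_penalty
```

In the supervised setting of the paper, W would be a fixed dictionary of note spectra learned in advance, and only the nonnegative activation matrix H would be optimized; with the Dual-IS or KL fidelity the objective above is convex in H, which is what motivates the convex-analytic approach.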

Original language: English
Title of host publication: 2016 24th European Signal Processing Conference, EUSIPCO 2016
Publisher: European Signal Processing Conference, EUSIPCO
Pages: 1138-1142
Number of pages: 5
ISBN (Electronic): 9780992862657
Publication status: Published - 2016 Nov 28
Event: 24th European Signal Processing Conference, EUSIPCO 2016 - Budapest, Hungary
Duration: 2016 Aug 28 – 2016 Sept 2

Publication series

Name: European Signal Processing Conference
Volume: 2016-November
ISSN (Print): 2219-5491

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
