Optimal weight learning for coupled tensor factorization with mixed divergences

Umut Simsekli, Beyza Ermis, Taylan Cemgil, Evrim Acar Ataman

7 Citations (Scopus)

Abstract

Incorporating domain-specific side information via coupled factorization models is useful in source separation applications. Coupled models can easily incorporate information from source modalities with different statistical properties by estimating shared factors via divergence minimization. Here, it is useful to use mixed divergences, i.e., a specific divergence for each modality. However, this extra freedom requires choosing the correct divergence as well as an optimal weighting mechanism that sets the relative 'importance' of each modality. In this paper, we present an approach for determining the relative weights, framed as dispersion parameter estimation, based on an inference framework introduced previously as Generalized Coupled Tensor Factorization (GCTF). The dispersion parameters play a key role in inference, as they balance the information obtained from the multimodal observations. We tackle the problem of optimal weighting by maximum likelihood, exploiting the relation between β-divergences and the family of Tweedie distributions. We demonstrate the usefulness of our approach on a drum source separation application.
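The paper's derivations and the GCTF machinery are not reproduced in this record. As a rough illustrative sketch only (not the authors' implementation), the following NumPy snippet factorizes two observations X1 ≈ W·H1 and X2 ≈ W·H2 with a shared factor W under possibly different β-divergences, and estimates a per-modality dispersion from the mean deviance, using the β-divergence/Tweedie correspondence (power p = 2 − β); the function names and the simple multiplicative updates are assumptions for illustration.

```python
import numpy as np

def beta_divergence(x, y, beta, eps=1e-12):
    """Sum of element-wise beta-divergences d_beta(x | y).
    beta=0 -> Itakura-Saito, beta=1 -> KL, beta=2 -> squared Euclidean."""
    x = np.maximum(x, eps)
    y = np.maximum(y, eps)
    if beta == 0:
        return np.sum(x / y - np.log(x / y) - 1)
    if beta == 1:
        return np.sum(x * np.log(x / y) - x + y)
    return np.sum((x**beta + (beta - 1) * y**beta - beta * x * y**(beta - 1))
                  / (beta * (beta - 1)))

def coupled_factorization(X1, X2, rank, beta1, beta2, w1=1.0, w2=1.0,
                          n_iter=200, eps=1e-12):
    """Toy coupled factorization: minimize
    w1 * D_beta1(X1 | W H1) + w2 * D_beta2(X2 | W H2)
    over nonnegative W, H1, H2 with plain multiplicative updates."""
    rng = np.random.default_rng(0)
    W = rng.random((X1.shape[0], rank)) + eps
    H1 = rng.random((rank, X1.shape[1])) + eps
    H2 = rng.random((rank, X2.shape[1])) + eps
    for _ in range(n_iter):
        Y1, Y2 = W @ H1, W @ H2
        # Standard beta-divergence multiplicative updates for the private factors.
        H1 *= (W.T @ (X1 * Y1**(beta1 - 2))) / (W.T @ Y1**(beta1 - 1) + eps)
        H2 *= (W.T @ (X2 * Y2**(beta2 - 2))) / (W.T @ Y2**(beta2 - 1) + eps)
        Y1, Y2 = W @ H1, W @ H2
        # Shared factor: weighted combination of the two modalities' updates.
        num = w1 * ((X1 * Y1**(beta1 - 2)) @ H1.T) + w2 * ((X2 * Y2**(beta2 - 2)) @ H2.T)
        den = w1 * (Y1**(beta1 - 1) @ H1.T) + w2 * (Y2**(beta2 - 1) @ H2.T) + eps
        W *= num / den
    return W, H1, H2

def tweedie_dispersion(X, Xhat, beta):
    """Mean-deviance dispersion estimate: the deviance of a Tweedie model with
    power p = 2 - beta is twice the beta-divergence; the coupling weight for
    this modality can then be taken as 1 / phi."""
    return 2.0 * beta_divergence(X, Xhat, beta) / X.size
```

In this sketch, larger estimated dispersions down-weight the corresponding modality, mirroring the paper's idea that the relative weights act as (inverse) dispersion parameters balancing the multimodal observations.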

Original language: English
Publication date: 2013
Number of pages: 5
Status: Published - 2013
Event: EUSIPCO 2013, Turkey
Duration: 1 Jun 2013 → …

Conference

Conference: EUSIPCO 2013
Country/Territory: Turkey
Period: 01/06/2013 → …
