On the identifiability of mixtures-of-experts

Wenxin Jiang*, Martin A Tanner

*Corresponding author for this work

Research output: Contribution to journal › Article

42 Scopus citations

Abstract

In mixtures-of-experts (ME) models, 'experts' of generalized linear models are combined according to a set of local weights called the 'gating function'. The invariant transformations of the ME probability density functions include the permutations of the expert labels and the translations of the parameters in the gating functions. Under certain conditions, we show that the ME systems are identifiable if the experts are ordered and the gating parameters are initialized. The conditions are validated for Poisson, gamma, normal and binomial experts.
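The two invariant transformations named in the abstract can be illustrated numerically. The sketch below (an illustration, not the paper's construction) uses normal experts with linear-predictor means and softmax gating; it checks that permuting the expert labels, or translating every gating vector by a common vector, leaves the ME density unchanged:

```python
import numpy as np

def softmax(z):
    z = z - z.max()            # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum()

def me_density(y, x, gate_params, expert_params):
    """ME density with normal experts and softmax gating.

    gate_params: (K, d) array of gating vectors v_j.
    expert_params: list of (beta_j, sigma_j) for normal experts
    (an illustrative choice; the paper treats general GLM experts).
    """
    g = softmax(gate_params @ x)            # local mixing weights ('gating function')
    dens = 0.0
    for gj, (beta, sigma) in zip(g, expert_params):
        mu = beta @ x                       # expert mean via linear predictor
        dens += gj * np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return dens

rng = np.random.default_rng(0)
x = rng.normal(size=3)
V = rng.normal(size=(2, 3))                 # gating vectors, one per expert
experts = [(rng.normal(size=3), 1.0), (rng.normal(size=3), 2.0)]

p = me_density(1.5, x, V, experts)
# 1) Permuting the expert labels (swap gating rows and experts together):
p_perm = me_density(1.5, x, V[::-1], list(reversed(experts)))
# 2) Translating all gating vectors by a common c shifts every softmax
#    logit by c @ x, so the gating weights are unchanged:
c = rng.normal(size=3)
p_shift = me_density(1.5, x, V + c, experts)
assert np.allclose(p, p_perm) and np.allclose(p, p_shift)
```

These are exactly the transformations the paper must quotient out: fixing an ordering of the experts removes the label permutations, and initializing the gating parameters (e.g. pinning one gating vector to zero) removes the translation freedom.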

Original language: English (US)
Pages (from-to): 1253-1258
Number of pages: 6
Journal: Neural Networks
Volume: 12
Issue number: 9
DOIs
State: Published - Nov 1 1999

Keywords

  • Generalized linear models
  • Identifiability
  • Invariant transformations
  • Mixtures-of-experts

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)
