Bayesian inference for hierarchical mixtures-of-experts with applications to regression and classification

Robert A. Jacobs*, Martin A. Tanner, Fengchun Peng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


This paper studies the problems of inference and prediction in a class of models known as hierarchical mixtures-of-experts (HME). The statistical model underlying an HME is a mixture model in which both the mixture coefficients and the mixture components are generalized linear models. Bayesian inference regarding an HME's parameters is presented in the contexts of regression and classification using Markov chain Monte Carlo methods. A benefit of this Bayesian approach is the ability to obtain a sample from the posterior distribution of any functional of the model's parameters, yielding more information than a point estimate provides. The methods are illustrated on a nonlinear regression problem and on a breast cancer classification problem. The results indicate that the HME achieved good prediction performance and offered the additional benefit of allowing the degree of certainty in its predictions to be assessed.
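To make the model class concrete, the following is a minimal sketch (not the paper's code) of a one-level mixture-of-experts density, the building block that an HME nests recursively. As the abstract states, both the gates and the experts are generalized linear models; here the gates are a softmax (multinomial logit) of linear scores and the experts are Gaussian linear regressions. All parameter values are hypothetical, chosen only for illustration.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def normal_pdf(y, mu, sigma):
    """Density of N(mu, sigma^2) at y."""
    z = (y - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def moe_density(x, y, gate_params, expert_params, sigma):
    """p(y | x) = sum_j g_j(x) * N(y; a_j + b_j * x, sigma^2).

    gate_params:   list of (intercept, slope) pairs for the gate scores
    expert_params: list of (intercept, slope) pairs for the expert means
    """
    gates = softmax([a + b * x for a, b in gate_params])
    return sum(g * normal_pdf(y, a + b * x, sigma)
               for g, (a, b) in zip(gates, expert_params))

# Hypothetical two-expert example: the gate shifts responsibility with x,
# and each expert fits a different linear trend.
gate_params = [(0.0, 2.0), (0.0, -2.0)]
expert_params = [(1.0, 0.5), (-1.0, 1.5)]
density = moe_density(0.3, 1.0, gate_params, expert_params, sigma=0.5)
```

In an HME, each "expert" above is itself replaced by another gated mixture, giving a tree of GLMs; the Bayesian treatment in the paper places a posterior over all gate and expert coefficients via MCMC rather than computing a single point estimate.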

Original language: English (US)
Pages (from-to): 375-390
Number of pages: 16
Journal: Statistical Methods in Medical Research
Issue number: 4
State: Published - Jan 1 1996

ASJC Scopus subject areas

  • Epidemiology
  • Statistics and Probability
  • Health Information Management

