A Bayesian approach to model selection in hierarchical mixtures-of-experts architectures

Robert A. Jacobs*, Fengchun Peng, Martin A. Tanner

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

44 Scopus citations


There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable: investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive 'divide-and-conquer' strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures.
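The 'divide-and-conquer' combination of a gating network with expert models described above can be sketched as a minimal one-level mixture-of-experts forward pass. This is an illustrative sketch only, not the authors' implementation: the linear experts, the softmax gate, and all variable names here are assumptions for exposition.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: the gating network's mixing proportions.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(x, gate_w, expert_w):
    """One-level mixture-of-experts prediction (illustrative sketch).

    x        : (d,) input vector
    gate_w   : (k, d) gating-network weights
    expert_w : (k, d) weights of k linear experts

    The gate partitions the input space softly; the output is the
    gate-weighted combination of the experts' predictions.
    """
    g = softmax(gate_w @ x)   # mixing proportions, nonnegative, sum to 1
    y = expert_w @ x          # each expert's linear prediction
    return float(g @ y)

rng = np.random.default_rng(0)
x = rng.normal(size=3)
gate_w = rng.normal(size=(2, 3))
expert_w = rng.normal(size=(2, 3))
print(moe_predict(x, gate_w, expert_w))
```

A hierarchical architecture nests this structure recursively: each "expert" may itself be a gated mixture, and the pruning procedure in the article estimates how much each such component contributes so that unused branches can be removed.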

Original language: English (US)
Pages (from-to): 231-241
Number of pages: 11
Journal: Neural Networks
Issue number: 2
State: Published - Mar 1997


Keywords

  • Bayesian analysis
  • Gibbs sampling
  • hierarchical architecture
  • model selection
  • modular architecture

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence


