On consistency of Bayesian inference with mixtures of logistic regression

Yang Ge*, Wenxin Jiang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This is a theoretical study of the consistency properties of Bayesian inference using mixtures of logistic regression models. When standard logistic regression models are combined in a mixtures-of-experts setup, a flexible model of the relationship between a binary (yes-no) response y and a vector of predictors x is obtained. Bayesian inference conditional on the observed data can then be used for regression and classification. This letter gives conditions on choosing the number of experts (i.e., the number of mixing components) k, or on choosing a prior distribution for k, so that Bayesian inference is consistent, in the sense of asymptotically approximating the underlying true relationship between y and x. The resulting classification rule is also consistent, in the sense of achieving near-optimal classification performance. We show that these desirable consistency properties hold with a nonstochastic k growing slowly with the sample size n of the observed data, or with a random k that takes large values with nonzero but small probabilities.
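For orientation, the model class the abstract describes can be sketched as follows; this is the standard mixtures-of-experts form with logistic experts and softmax gating, and the notation (k experts, gating parameters \alpha_j, expert parameters \beta_j) is illustrative rather than taken from the paper itself:

\[
P(y = 1 \mid x) \;=\; \sum_{j=1}^{k} g_j(x)\, \sigma\!\left(\beta_j^{\top} x\right),
\qquad
g_j(x) \;=\; \frac{\exp\!\left(\alpha_j^{\top} x\right)}{\sum_{l=1}^{k} \exp\!\left(\alpha_l^{\top} x\right)},
\qquad
\sigma(t) \;=\; \frac{1}{1 + e^{-t}}.
\]

Consistency here means that, as the sample size n grows, the posterior concentrates on models whose conditional probability P(y = 1 | x) is close to the true one; the induced plug-in classifier then attains risk close to the Bayes-optimal risk.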

Original language: English (US)
Pages (from-to): 224-243
Number of pages: 20
Journal: Neural Computation
Volume: 18
Issue number: 1
DOIs
State: Published - Jan 2006

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

