A note on mixtures of experts for multiclass responses: Approximation rate and consistent Bayesian inference

Yang Ge*, Wenxin Jiang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We report that mixtures of m multinomial logistic regressions can be used to approximate a class of 'smooth' probability models for multiclass responses. With bounded second derivatives of the log-odds, the approximation rate is O(m^{-2/s}) in Hellinger distance or O(m^{-4/s}) in Kullback-Leibler divergence, where s = dim(x) is the dimension of the input space (i.e., the number of predictors). With training data of size n, we also show that 'consistency' in multiclass regression and classification can be achieved, simultaneously for all classes, when posterior-based inference is performed in a Bayesian framework. Loosely speaking, such 'consistency' refers to performance being often close to the best possible for large n. Consistency can be achieved either by taking m = m_n, or by taking m to be uniformly distributed on {1, ..., m_n} under the prior, where 1 ≺ m_n ≺ n^a in order as n grows, for some a ∈ (0, 1).
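For readers unfamiliar with the model class the abstract refers to, the following minimal Python sketch shows how a mixture of m multinomial-logit 'experts' with softmax gating yields class probabilities for a multiclass response. This is not code from the paper; the function names, the intercept-free linear parameterization, and the array shapes are illustrative assumptions.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_class_probs(x, gate_w, expert_w):
    """Class probabilities under a mixture of m multinomial logistic experts.

    x        : (s,) input vector, s = dim(x) (the number of predictors)
    gate_w   : (m, s) gating weights; softmax over the m experts
    expert_w : (m, K, s) per-expert multinomial logistic weights over K classes

    Returns the length-K vector p(y = k | x) = sum_j g_j(x) * softmax(expert_w[j] @ x)[k].
    """
    g = softmax(gate_w @ x)             # (m,) mixing weights g_j(x)
    p = softmax(expert_w @ x, axis=-1)  # (m, K) per-expert class probabilities
    return g @ p                        # (K,) mixture class probabilities

# Example: m = 3 experts, K = 4 classes, s = 5 predictors
rng = np.random.default_rng(0)
m, K, s = 3, 4, 5
probs = moe_class_probs(rng.normal(size=s),
                        rng.normal(size=(m, s)),
                        rng.normal(size=(m, K, s)))
print(probs, probs.sum())  # a valid probability vector summing to 1
```

In the Bayesian setup described above, the number of experts m would itself receive a prior (e.g., uniform on {1, ..., m_n}), with inference averaging over the posterior rather than fixing a single m.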

Original language: English (US)
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 329-335
Number of pages: 7
Volume: 2006
State: Published - Oct 6 2006
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: Jun 25 2006 – Jun 29 2006

Other

Other: ICML 2006: 23rd International Conference on Machine Learning
Country: United States
City: Pittsburgh, PA
Period: 6/25/06 – 6/29/06

ASJC Scopus subject areas

  • Engineering (all)

