Abstract
We report that mixtures of m multinomial logistic regressions can be used to approximate a class of 'smooth' probability models for multiclass responses. With bounded second derivatives of the log-odds, the approximation rate is O(m^{-2/s}) in Hellinger distance or O(m^{-4/s}) in Kullback-Leibler divergence, where s = dim(x) is the dimension of the input space (i.e., the number of predictors). Given training data of size n, we also show that 'consistency' in multiclass regression and classification can be achieved, simultaneously for all classes, when posterior-based inference is performed in a Bayesian framework. Loosely speaking, such 'consistency' means that performance is often close to the best possible for large n. Consistency can be achieved either by taking m = m_n, or by taking m to be uniformly distributed on {1, ..., m_n} under the prior, where 1 ≺ m_n ≺ n^a in order as n grows, for some a ∈ (0, 1).
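To make the model concrete, the following is a minimal NumPy sketch of how class probabilities could be computed under a mixture of m multinomial logistic regressions. It assumes a mixture-of-experts style parameterization with softmax gating; the paper's exact parameterization may differ, and all names here (mixture_multinomial_logit, gate_W, expert_W, etc.) are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mixture_multinomial_logit(x, gate_W, gate_b, expert_W, expert_b):
    """Class probabilities under a mixture of m multinomial logistic
    regressions (a mixture-of-experts sketch; parameterization assumed).

    x        : (s,) input vector, s = dim(x)
    gate_W   : (m, s) gating weights;  gate_b   : (m,)
    expert_W : (m, K, s) per-expert logit weights;  expert_b : (m, K)
    returns  : (K,) mixture class probabilities
    """
    pi = softmax(gate_W @ x + gate_b)    # (m,) mixing weights, sum to 1
    logits = expert_W @ x + expert_b     # (m, K) per-expert class logits
    p = softmax(logits)                  # (m, K) per-expert class probabilities
    return pi @ p                        # (K,) mixture of the m experts

# Hypothetical usage with random parameters:
rng = np.random.default_rng(0)
s, K, m = 5, 3, 4                        # predictors, classes, mixture size
x = rng.normal(size=s)
probs = mixture_multinomial_logit(
    x,
    gate_W=rng.normal(size=(m, s)), gate_b=rng.normal(size=m),
    expert_W=rng.normal(size=(m, K, s)), expert_b=rng.normal(size=(m, K)),
)
print(probs, probs.sum())                # a valid probability vector summing to 1
```

In the paper's Bayesian setting, the parameters above would carry a prior and m itself may be drawn uniformly from {1, ..., m_n}; the sketch only shows the likelihood-side computation for fixed parameters.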
Original language | English (US)
---|---
Title of host publication | ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages | 329-335
Number of pages | 7
Volume | 2006
State | Published - Oct 6 2006
Event | ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States (Jun 25 2006 → Jun 29 2006)
Other
Other | ICML 2006: 23rd International Conference on Machine Learning
---|---
Country | United States
City | Pittsburgh, PA
Period | 6/25/06 → 6/29/06
ASJC Scopus subject areas
- Engineering (all)