Bayesian variable selection for high dimensional generalized linear models: Convergence rates of the fitted densities

Wenxin Jiang*

*Corresponding author for this work

Research output: Contribution to journal › Article

44 Scopus citations

Abstract

Bayesian variable selection has gained much empirical success recently in a variety of applications when the number K of explanatory variables (x1, . . . , xK) is possibly much larger than the sample size n. For generalized linear models, if most of the xj's have very small effects on the response y, we show that it is possible to use Bayesian variable selection to reduce overfitting caused by the curse of dimensionality K ≫ n. In this approach a suitable prior can be used to choose a few out of the many xj's to model y, so that the posterior will propose probability densities p that are "often close" to the true density p* in some sense. The closeness can be described by a Hellinger distance between p and p* that scales at a power very close to n^{-1/2}, which is the "finite-dimensional rate" corresponding to a low-dimensional situation. These findings extend some recent work of Jiang [Technical Report 05-02 (2005) Dept. Statistics, Northwestern Univ.] on consistency of Bayesian variable selection for binary classification.
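The abstract describes the approach only at a high level; for orientation, recall that the Hellinger distance referenced above is, up to a convention-dependent constant, d_H(p, p*) = (∫ (√p − √p*)² dμ)^{1/2}. As a purely illustrative toy sketch (not the paper's construction), the Python program below runs a Metropolis search over subsets of predictors for a logistic-regression GLM, scoring each subset with a ridge-stabilized, BIC-style approximation to its log posterior probability. The data, penalty constants, and iteration counts are all assumptions made for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: K candidate predictors, only 3 truly active (sparse truth),
    # binary response from a logistic-regression GLM.
    n, K = 120, 300
    X = rng.standard_normal((n, K))
    beta_true = np.zeros(K)
    beta_true[:3] = [1.5, -2.0, 1.0]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

    def model_score(active, tau2=4.0):
        """Ridge-stabilized BIC-style log score for a subset of predictors:
        maximized penalized logistic log-likelihood minus a size penalty."""
        idx = np.flatnonzero(active)
        if idx.size == 0:
            return -n * np.log(2.0)              # null model: p(y = 1) = 1/2
        Xa = X[:, idx]
        beta = np.zeros(idx.size)
        for _ in range(30):                      # Newton-Raphson for the MAP
            p = 1.0 / (1.0 + np.exp(-(Xa @ beta)))
            grad = Xa.T @ (y - p) - beta / tau2
            H = Xa.T @ (Xa * (p * (1 - p))[:, None]) + np.eye(idx.size) / tau2
            beta += np.linalg.solve(H, grad)
        eta = Xa @ beta
        loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
        return loglik - beta @ beta / (2 * tau2) - 0.5 * np.log(n) * idx.size

    # Metropolis search over subsets: flip one inclusion indicator per step.
    # The single-flip proposal is symmetric, so the plain MH ratio applies.
    active = np.zeros(K, dtype=bool)
    score = model_score(active)
    inclusion = np.zeros(K)
    n_iter, burn = 3000, 500
    for t in range(n_iter):
        prop = active.copy()
        j = rng.integers(K)
        prop[j] = ~prop[j]
        s = model_score(prop)
        if np.log(rng.random()) < s - score:
            active, score = prop, s
        if t >= burn:
            inclusion += active

    # Posterior inclusion frequencies of the three truly active predictors.
    print(np.round(inclusion[:3] / (n_iter - burn), 2))

Under the sparsity assumed in this toy setup, the chain tends to concentrate on small models containing the truly active predictors; the theory in the paper quantifies the analogous concentration of the fitted densities around p* at a rate very close to n^{-1/2}.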

Original language: English (US)
Pages (from-to): 1487-1511
Number of pages: 25
Journal: Annals of Statistics
Volume: 35
Issue number: 4
DOIs
State: Published - Aug 1 2007

Keywords

  • Convergence rates
  • Generalized linear models
  • High dimensional data
  • Posterior distribution
  • Prior distribution
  • Sparsity
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

