A note on some algorithms for the Gibbs posterior

Kun Chen, Wenxin Jiang*, Martin A. Tanner

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Jiang and Tanner (2008) consider a method of classification using the Gibbs posterior, which is directly constructed from the empirical classification errors. They propose an algorithm to sample from the Gibbs posterior which utilizes a smoothed approximation of the empirical classification error, via a Gibbs sampler with augmented latent variables. In this paper, we note some drawbacks of this algorithm and propose an alternative method for sampling from the Gibbs posterior, based on the Metropolis algorithm. The numerical performance of the algorithms is examined and compared via simulated data. We find that the Metropolis algorithm produces good classification results with improved computational speed.
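For readers who want a concrete picture of the sampling scheme described above, the following is a minimal sketch (not the authors' code) of a random-walk Metropolis sampler targeting a Gibbs posterior of the assumed form pi_n(theta) ∝ exp(-lambda * n * R_n(theta)) * prior(theta), where R_n is the empirical misclassification rate of a linear classifier. The temperature lambda, the proposal scale, and the Gaussian prior are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def empirical_error(theta, X, y):
    """Empirical classification error of the linear rule sign(X @ theta)."""
    preds = np.sign(X @ theta)
    return np.mean(preds != y)

def log_gibbs_posterior(theta, X, y, lambda_):
    """Log of the (unnormalized) Gibbs posterior; standard normal prior is an assumption."""
    n = len(y)
    log_prior = -0.5 * np.dot(theta, theta)
    return -lambda_ * n * empirical_error(theta, X, y) + log_prior

def metropolis_gibbs_posterior(X, y, lambda_=1.0, n_iter=5000, scale=0.1, seed=None):
    """Random-walk Metropolis sampler for the Gibbs posterior sketched above."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    theta = np.zeros(d)
    current_lp = log_gibbs_posterior(theta, X, y, lambda_)
    samples = []
    for _ in range(n_iter):
        proposal = theta + scale * rng.standard_normal(d)     # Gaussian random-walk proposal
        prop_lp = log_gibbs_posterior(proposal, X, y, lambda_)
        if np.log(rng.uniform()) < prop_lp - current_lp:      # Metropolis accept/reject step
            theta, current_lp = proposal, prop_lp
        samples.append(theta.copy())
    return np.array(samples)
```

Because the empirical error is a step function of theta, the acceptance ratio only changes when a proposal flips at least one classification, which is part of why a Metropolis scheme can be attractive here; the specific tuning of scale and lambda in practice is discussed in the paper itself.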

Original language: English (US)
Pages (from-to): 1234-1241
Number of pages: 8
Journal: Statistics and Probability Letters
Volume: 80
Issue number: 15-16
DOIs
State: Published - Aug 2010

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
