A Monte Carlo Implementation of the EM Algorithm and the Poor Man's Data Augmentation Algorithms

Greg C G Wei, Martin A. Tanner

Research output: Contribution to journal › Article › peer-review

823 Scopus citations

Abstract

The first part of this article presents the Monte Carlo implementation of the E step of the EM algorithm. Given the current guess of the maximizer of the posterior distribution, latent data patterns are generated from the conditional predictive distribution. The expected value of the augmented log-posterior is then updated as a mixture of augmented log-posteriors, mixed over the generated latent data patterns (multiple imputations). In the M step of the algorithm, this mixture is maximized to obtain the update to the maximizer of the observed posterior. The gradient and Hessian of the observed log-posterior are also expressed as mixtures, mixed over the multiple imputations. The relation between the Monte Carlo EM (MCEM) algorithm and the data augmentation algorithm is noted. Two modifications to the MCEM algorithm (the poor man's data augmentation algorithms), which allow for the calculation of the entire posterior, are then presented. These approximations serve as diagnostics for the validity of the normal approximation to the posterior, as well as starting points for the full data augmentation analysis. The methodology is illustrated with two examples.
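To make the E/M cycle described above concrete, here is a minimal sketch of an MCEM iteration for a toy problem not taken from the paper: estimating the mean of N(mu, 1) data when some observations are right-censored. The function name, censoring setup, and all constants are illustrative assumptions; the E step imputes each censored value m times from the conditional predictive (truncated normal) distribution given the current guess, and the M step maximizes the resulting mixture of augmented log-likelihoods, which for a normal mean reduces to an average over observed and imputed values.

```python
import random

def mcem_censored_normal(obs, censored_at, n_cens, m=50, iters=30, seed=0):
    """Toy MCEM sketch (hypothetical example, not the paper's application):
    estimate the mean mu of N(mu, 1) data when n_cens values are
    right-censored at censored_at and only their count is known."""
    rng = random.Random(seed)
    mu = sum(obs) / len(obs)  # starting guess from the uncensored data
    n = len(obs) + n_cens
    for _ in range(iters):
        # Monte Carlo E step: draw m imputations of each censored value
        # from the conditional predictive distribution, i.e. N(mu, 1)
        # truncated to (censored_at, inf), via simple rejection sampling.
        imputed_sum = 0.0
        for _ in range(n_cens * m):
            z = rng.gauss(mu, 1.0)
            while z <= censored_at:
                z = rng.gauss(mu, 1.0)
            imputed_sum += z
        # M step: maximize the mixture of augmented log-likelihoods;
        # for a normal mean this is the average of the observed values
        # and the imputed values (each censored point weighted 1/m).
        mu = (sum(obs) + imputed_sum / m) / n
    return mu
```

With simulated data from N(1, 1) censored at 1.5, the naive mean of the uncensored values is biased downward, while the MCEM estimate recovers a value near the true mean; increasing m reduces the Monte Carlo noise in each update.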

Original language: English (US)
Pages (from-to): 699-704
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 85
Issue number: 411
DOIs
State: Published - Sep 1990

Keywords

  • Bayesian inference
  • Multiple imputation
  • Simulation

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
