## Abstract

The first part of this article presents the Monte Carlo implementation of the E step of the EM algorithm. Given the current guess of the maximizer of the posterior distribution, latent data patterns are generated from the conditional predictive distribution. The expected value of the augmented log-posterior is then updated as a mixture of augmented log-posteriors, mixed over the generated latent data patterns (multiple imputations). In the M step of the algorithm, this mixture is maximized to obtain the update to the maximizer of the observed posterior. The gradient and Hessian of the observed log-posterior are also expressed as mixtures, mixed over the multiple imputations. The relation between the Monte Carlo EM (MCEM) algorithm and the data augmentation algorithm is noted. Two modifications to the MCEM algorithm (the poor man’s data augmentation algorithms), which allow for the calculation of the entire posterior, are then presented. These approximations serve as diagnostics for the validity of the normal approximation to the posterior, as well as starting points for the full data augmentation analysis. The methodology is illustrated with two examples.
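As an illustration of the iteration the abstract describes, the sketch below applies MCEM to a toy problem not taken from the article: estimating the mean of a N(mu, 1) sample in which some observations are right-censored. The Monte Carlo E step imputes each censored value many times from the conditional predictive distribution (a truncated normal, drawn here by simple rejection sampling), and the M step maximizes the imputation-averaged complete-data log-likelihood, which for this model has a closed form. The function name and all numerical settings are illustrative assumptions, not from the paper.

```python
import random

def mcem_censored_normal(y_obs, c, n_cens, n_draws=200, n_iter=50, seed=0):
    """Monte Carlo EM for the mean of a N(mu, 1) sample with n_cens
    observations right-censored at c (only the event y > c is known).

    Toy example, not from the article. The E step generates multiple
    imputations of each latent (censored) value from the conditional
    predictive distribution N(mu, 1) truncated to (c, inf); the M step
    maximizes the averaged complete-data log-likelihood, whose
    maximizer here is the mean of the observed and imputed values.
    """
    rng = random.Random(seed)
    mu = sum(y_obs) / len(y_obs)  # starting guess from the uncensored data
    for _ in range(n_iter):
        # Monte Carlo E step: impute each censored value n_draws times
        # by rejection sampling from the truncated normal.
        imputed_means = []
        for _ in range(n_cens):
            draws = []
            while len(draws) < n_draws:
                z = rng.gauss(mu, 1.0)
                if z > c:  # keep only draws above the censoring point
                    draws.append(z)
            imputed_means.append(sum(draws) / n_draws)
        # M step: closed-form maximizer of the imputation-averaged
        # complete-data log-likelihood (the overall sample mean).
        mu = (sum(y_obs) + sum(imputed_means)) / (len(y_obs) + n_cens)
    return mu
```

Because the M step averages over many imputations rather than a single one, the update approximates the deterministic EM step, with Monte Carlo noise that shrinks as `n_draws` grows; this is the sense in which MCEM replaces the exact E-step expectation by simulation.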

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 699-704 |
| Number of pages | 6 |
| Journal | Journal of the American Statistical Association |
| Volume | 85 |
| Issue number | 411 |
| DOIs | |
| State | Published - Sep 1990 |

## Keywords

- Bayesian inference
- Multiple imputation
- Simulation

## ASJC Scopus subject areas

- Statistics and Probability
- Statistics, Probability and Uncertainty