Adjusting for Publication Bias in Meta-Analysis: An Evaluation of Selection Methods and Some Cautionary Notes

Blakeley B. McShane*, Ulf Böckenholt, Karsten T. Hansen

*Corresponding author for this work

Research output: Contribution to journal › Article

90 Scopus citations

Abstract

We review and evaluate selection methods, a prominent class of techniques first proposed by Hedges (1984) that assess and adjust for publication bias in meta-analysis, via an extensive simulation study. Our simulation covers both restrictive and more realistic settings and proceeds across multiple metrics that assess different aspects of model performance. This evaluation is timely in light of two recently proposed approaches, the so-called p-curve and p-uniform approaches, that can be viewed as alternative implementations of the original Hedges selection method approach. We find that the p-curve and p-uniform approaches perform reasonably well but not as well as the original Hedges approach in the restrictive setting for which all three were designed. We also find they perform poorly in more realistic settings, whereas variants of the Hedges approach perform well. We conclude by urging caution in the application of selection methods: Given the idealistic model assumptions underlying selection methods and the sensitivity of population average effect size estimates to them, we advocate that selection methods should be used less for obtaining a single estimate that purports to adjust for publication bias ex post and more for sensitivity analysis—that is, exploring the range of estimates that result from assuming different forms and severities of publication bias.
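The sensitivity analysis the abstract advocates can be illustrated with a minimal sketch of a one-parameter step-function selection model in the spirit of Hedges (1984). The sketch below is not the authors' implementation: it assumes a single relative publication probability `w` for statistically nonsignificant results, simulates an illustrative set of published studies, and then reports the maximum-likelihood adjusted estimate of the population average effect under several assumed values of `w` (assumed severities of publication bias). All names, parameter values, and the grid-search optimizer are illustrative choices.

```python
import math
import random

def phi_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_logpdf(x, mu, s):
    # Log density of N(mu, s^2) at x.
    z = (x - mu) / s
    return -0.5 * z * z - math.log(s * math.sqrt(2.0 * math.pi))

def neg_loglik(mu, studies, w, zcrit=1.96):
    """Negative log-likelihood of a step-function selection model:
    significant studies are published with probability 1,
    nonsignificant studies with relative probability w."""
    nll = 0.0
    for y, s in studies:
        sig = abs(y / s) >= zcrit
        weight = 1.0 if sig else w
        # Normalizing constant: expected publication probability
        # for a study with SE s when the true effect is mu.
        p_sig = phi_cdf((-zcrit * s - mu) / s) + 1.0 - phi_cdf((zcrit * s - mu) / s)
        a = p_sig + w * (1.0 - p_sig)
        nll -= norm_logpdf(y, mu, s) + math.log(weight) - math.log(a)
    return nll

def adjusted_estimate(studies, w):
    # Simple grid search over mu in [-0.5, 1.0]; an illustrative optimizer.
    grid = [i / 200.0 for i in range(-100, 201)]
    return min(grid, key=lambda mu: neg_loglik(mu, studies, w))

# Simulate published studies: true mu = 0.2, and only 30% of
# nonsignificant results survive to publication (illustrative values).
random.seed(1)
published = []
while len(published) < 40:
    s = random.uniform(0.1, 0.4)
    y = random.gauss(0.2, s)
    if abs(y / s) >= 1.96 or random.random() < 0.3:
        published.append((y, s))

# Sensitivity analysis: the adjusted estimate under different assumed
# severities of publication bias (w = 1 is the naive, no-bias estimate).
for w in (1.0, 0.5, 0.3, 0.1):
    print(f"assumed w = {w:.1f}  adjusted mu-hat = {adjusted_estimate(published, w):.3f}")
```

Reporting the whole range of estimates across assumed `w` values, rather than committing to a single adjusted number, is exactly the sensitivity-analysis use of selection methods that the abstract recommends.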

Original language: English (US)
Pages (from-to): 730-749
Number of pages: 20
Journal: Perspectives on Psychological Science
Volume: 11
Issue number: 5
DOIs
State: Published - Sep 1 2016

Keywords

  • effect size
  • meta-analysis
  • p-curve
  • p-uniform
  • selection methods

ASJC Scopus subject areas

  • Psychology (all)

