Reconsidering statistical methods for assessing replication.

J. M. Schauer*, L. V. Hedges

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Recent empirical evaluations of replication in psychology have reported startlingly few successful replication attempts. At the same time, these programs have noted that the proper way to analyze replication studies is far from a settled matter, and they have analyzed their data in several different ways. This presents two challenges to interpreting the results of these programs. First, different analysis methods assess different operational definitions of replication. Second, the properties of these methods are not necessarily common knowledge; it is possible for a successful replication to be deemed a failure by nearly all of the metrics used, and it is not always immediately clear how likely such errors are to occur. In this article, we describe the methods commonly used in replication research and how they imply specific operational definitions of replication. We then compute the probability of false failure (i.e., a successful replication is concluded to have failed) and false success determinations. These are shown to be high (often over 50%) and in many cases uncontrolled. We then demonstrate that errors are probable in the data to which these methods have been applied in the literature. We show that the probability that some reported conclusions about replication are incorrect can be as high as 75–80%. (PsycInfo Database Record (c) 2021 APA, all rights reserved)

Translational Abstract—This article examines analysis methods for studying replications, and it finds that some of the commonly used methods have severe limitations. These limitations include the fact that a "replication failure" could arise by chance alone with a surprisingly high probability, even if the studies successfully replicate.
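The "false failure by chance alone" point can be illustrated with a simulation. The sketch below is a hypothetical example, not the authors' analysis: it assumes both the original study and its replication share the same true standardized effect (d = 0.5) and judges replication by the common metric of whether the replication study is statistically significant on its own. The effect size, sample size, and significance criterion are illustrative assumptions.

```python
import random
import statistics

random.seed(1)

d = 0.5          # assumed true standardized effect, identical in both studies
n = 50           # per-group sample size of the replication study
reps = 20_000    # number of simulated replication attempts
t_crit = 1.984   # two-sided 5% critical value for a t distribution, df = 2n - 2 = 98

def two_sample_t(x, y):
    """Pooled-variance two-sample t statistic."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    pooled = ((len(x) - 1) * vx + (len(y) - 1) * vy) / (len(x) + len(y) - 2)
    return (mx - my) / (pooled * (1 / len(x) + 1 / len(y))) ** 0.5

# Count replications that come out non-significant -- a "false failure,"
# since the effect is real by construction.
false_failures = sum(
    abs(two_sample_t(
        [random.gauss(d, 1.0) for _ in range(n)],   # treatment group
        [random.gauss(0.0, 1.0) for _ in range(n)],  # control group
    )) < t_crit
    for _ in range(reps)
)

false_failure_rate = false_failures / reps
print(f"P(false failure) = {false_failure_rate:.2f}")
```

Under these assumed numbers the significance criterion declares a "failure" roughly 30% of the time, even though every simulated replication shares the true effect, which is the kind of uncontrolled error rate the abstract describes.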

Original language: English (US)
Pages (from-to): 127-139
Number of pages: 13
Journal: Psychological methods
Volume: 26
Issue number: 1
DOIs
State: Published - Feb 2021

Keywords

  • error
  • hypothesis testing
  • meta-analysis
  • replication

ASJC Scopus subject areas

  • Psychology (miscellaneous)

