Investigating Science Education Effect Sizes: Implications for Power Analyses and Programmatic Decisions

Joseph A. Taylor, Susan M. Kowalski, Joshua R. Polanin, Karen Askinas, Molly A.M. Stuhlsatz, Christopher D. Wilson, Elizabeth Tipton, Sandra Jo Wilson

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

A priori power analyses allow researchers to estimate the number of participants needed to detect the effects of an intervention. However, power analyses are only as valid as the parameter estimates used. One such parameter, the expected effect size, can vary greatly depending on several study characteristics, including the nature of the intervention, developer of the outcome measure, and age of the participants. Researchers should understand this variation when designing studies. Our meta-analysis examines the relationship between science education intervention effect sizes and a host of study characteristics, allowing primary researchers to access better estimates of effect sizes for a priori power analyses. The results of this meta-analysis also support programmatic decisions by setting realistic expectations about the typical magnitude of impacts for science education interventions.
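As a rough illustration of the a priori power analysis the abstract describes, the sketch below uses the standard two-sample normal approximation to estimate the per-group sample size implied by an assumed effect size (Cohen's d). This is a textbook formula, not the authors' meta-analytic model, and the example effect sizes are hypothetical:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group n for a two-sided, two-sample comparison.

    Uses the normal approximation n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    a standard sketch rather than the article's method.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A smaller expected effect size demands far more participants:
print(n_per_group(0.50))  # moderate effect -> 63 per group
print(n_per_group(0.20))  # small effect -> 393 per group
```

The steep growth in required n as the assumed effect shrinks is why the choice of the expected-effect-size parameter, the focus of this meta-analysis, matters so much for study planning.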

Original language: English (US)
Journal: AERA Open
Volume: 4
Issue number: 3
DOIs
State: Published - Jul 1 2018
Externally published: Yes

Keywords

  • effect size
  • meta-analysis
  • program evaluation
  • science education
  • statistics
  • student achievement

ASJC Scopus subject areas

  • Education
  • Social Sciences (miscellaneous)
  • Developmental and Educational Psychology
