Kappa coefficient: a popular measure of rater agreement

Wan Tang*, Jun Hu, Hui Zhang, Pan Wu, Hua He

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

181 Scopus citations

Abstract

Summary: In mental health and psychosocial studies, it is often necessary to report on the between-rater agreement of the measures used in the study. This paper discusses the concept of agreement, highlighting its fundamental difference from correlation. Several examples demonstrate how to compute the kappa coefficient (a popular statistic for measuring agreement) both by hand and by using statistical software packages such as SAS and SPSS. Real study data are used to illustrate how to use and interpret this coefficient in clinical research and practice. The article concludes with a discussion of the limitations of the coefficient.
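The abstract refers to computing the kappa coefficient by hand and with statistical software. As a minimal illustration added here (not code or data from the article itself), the unweighted Cohen's kappa for two raters is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal rating frequencies. The short Python sketch below computes it for a small set of hypothetical binary ratings; in practice the same value is produced by the SAS and SPSS procedures the article discusses (e.g., PROC FREQ with the AGREE option, or CROSSTABS with the kappa statistic).

from collections import Counter

def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two raters over the same subjects."""
    n = len(rater1)
    # Observed agreement: proportion of subjects given identical ratings.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: for each category, the product of the two raters'
    # marginal proportions, summed over all categories used by either rater.
    freq1, freq2 = Counter(rater1), Counter(rater2)
    categories = set(freq1) | set(freq2)
    p_e = sum((freq1[c] / n) * (freq2[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings (1 = case, 0 = non-case) for 10 subjects.
rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 3))  # 0.583: p_o = 0.8, p_e = 0.52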

Original language: English (US)
Pages (from-to): 62-67
Number of pages: 6
Journal: Shanghai Archives of Psychiatry
Volume: 27
Issue number: 1
State: Published - 2015

Keywords

  • Correlation
  • Interrater agreement
  • Kappa coefficient
  • Weighted kappa

ASJC Scopus subject areas

  • Neurology
  • Psychiatry and Mental health
