Modeling agreement among raters

Martin A. Tanner, Michael A. Young

Research output: Contribution to journal › Article › peer-review

124 Scopus citations


An approach to the modeling of agreement among raters is proposed. By examining a hierarchy of log-linear models, it is shown how one can analyze the agreement among the raters in a manner analogous to the analysis of association in a contingency table. Specific attention is given to the problems of the K-rater agreement and the agreement between several observers and a standard. Examples are used to illustrate how this approach provides a general framework for modeling agreement in a variety of problem situations.
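The log-linear approach described above corrects observed agreement for chance in the same spirit as Cohen's kappa (one of the listed keywords): the baseline independence model plays the role of "agreement expected by chance." A minimal sketch of that idea, using illustrative counts that are not taken from the paper:

```python
import numpy as np

# Hypothetical 3x3 cross-classification of two raters
# (counts are illustrative only, not from the paper).
table = np.array([[20,  5, 2],
                  [ 4, 15, 3],
                  [ 1,  2, 8]], dtype=float)

n = table.sum()
row = table.sum(axis=1)
col = table.sum(axis=0)

# Expected counts under the baseline log-linear independence model
# (log m_ij = u + u1(i) + u2(j)), the analogue of "no association"
# in a contingency table.
expected = np.outer(row, col) / n

# Cohen's kappa: observed diagonal agreement corrected for the
# agreement expected under independence.
p_obs = np.trace(table) / n
p_exp = np.trace(expected) / n
kappa = (p_obs - p_exp) / (1 - p_exp)
print(f"kappa = {kappa:.3f}")
```

The paper's contribution goes beyond this summary statistic: by adding parameters (e.g., for the diagonal cells) to the independence model, one can test and model the structure of agreement rather than reduce it to a single index.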

Original language: English (US)
Pages (from-to): 175-180
Number of pages: 6
Journal: Journal of the American Statistical Association
Issue number: 389
State: Published - Mar 1985


Keywords

  • Agreement
  • Categorical data
  • Kappa

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

