Article
Understanding and Computing Cohen’s Kappa: A Tutorial.
WebPsychEmpiricist. Web Journal at http://wpe.info/ (2007)
  • James M. Wood, University of Texas at El Paso
Abstract

Cohen’s Kappa (Cohen, 1960) is an index of interrater reliability that is commonly used to measure the level of agreement between two sets of dichotomous ratings or scores. This tutorial explains the underlying logic of Kappa and shows why it is superior to simple percentage of agreement as a measure of interrater reliability. Examples demonstrate how to calculate Kappa both by hand and with SPSS.
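
For readers who want the computation at a glance: Cohen's (1960) kappa is defined as k = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement between the two raters and p_e is the proportion of agreement expected by chance, computed from each rater's marginal proportions. The tutorial itself works examples by hand and in SPSS; the sketch below is a minimal Python equivalent for two dichotomous (0/1) rating vectors. The function name and the example data are illustrative assumptions, not taken from the paper.

    def cohens_kappa(ratings_a, ratings_b):
        """Cohen's kappa for two raters' dichotomous (0/1) ratings."""
        assert len(ratings_a) == len(ratings_b) and len(ratings_a) > 0
        n = len(ratings_a)
        # Observed agreement: proportion of cases where the raters agree.
        p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        # Chance-expected agreement from each rater's marginal proportions:
        # both rate "1" by chance, plus both rate "0" by chance.
        p_a1 = sum(ratings_a) / n  # rater A's proportion of "1" ratings
        p_b1 = sum(ratings_b) / n  # rater B's proportion of "1" ratings
        p_e = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical example: 10 cases rated by two raters.
    rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
    rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
    print(round(cohens_kappa(rater_a, rater_b), 3))  # 0.583

In this toy example the raters agree on 8 of 10 cases (p_o = 0.80), but chance alone would produce p_e = 0.52 agreement given the marginals, so kappa = (0.80 - 0.52) / (1 - 0.52) = 0.583. This is exactly the correction that makes kappa preferable to raw percentage agreement: the 80% figure looks impressive until the chance baseline is subtracted out.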

Keywords
  • kappa; interrater reliability; cohen's kappa; statistics; psychometrics
Publication Date
October 3, 2007
Citation Information
James M. Wood. "Understanding and Computing Cohen’s Kappa: A Tutorial." WebPsychEmpiricist. Web Journal at http://wpe.info/ (2007)
Available at: http://works.bepress.com/james_wood/22/