Beyond Kappa: Estimating Inter-Rater Agreement with Nominal Classifications
Published
May 1, 2009
Nol Bendermacher
Radboud University, Nijmegen, The Netherlands
Pierre Souren
Radboud University, Nijmegen, The Netherlands
Abstract
Cohen’s Kappa and a number of related measures can all be criticized for the way they define the correction for chance agreement. A measure is introduced that derives the corrected proportion of agreement directly from the data, thereby overcoming the objections to Kappa and its related measures.
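For context, the classical measure being critiqued is Cohen's Kappa, which corrects observed agreement by the chance agreement implied by each rater's marginal frequencies. A minimal sketch for two raters with nominal categories (function and variable names are illustrative, not from the paper):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e the chance agreement computed
    from the product of the two raters' marginal proportions."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from the marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

a = ["x", "x", "y", "y", "x", "y"]
b = ["x", "y", "y", "y", "x", "x"]
print(round(cohens_kappa(a, b), 3))  # 4/6 observed, 0.5 by chance -> 0.333
```

The paper's objection targets the definition of `p_e`: it is derived from the marginals rather than from the data's actual agreement structure, which the proposed measure aims to estimate directly.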