Cohen's kappa

GPTKB entity

Statements (27)
Predicate Object
gptkbp:instanceOf statistic
gptkbp:alternativeTo percent agreement
gptkbp:appliesTo categorical data
gptkbp:category reliability coefficient
gptkbp:form κ = (p_o - p_e) / (1 - p_e)
https://www.w3.org/2000/01/rdf-schema#label Cohen's kappa
gptkbp:interpretedBy 0 means agreement equivalent to chance
1 means perfect agreement
negative values mean less than chance agreement
gptkbp:limitation sensitive to bias
sensitive to prevalence
gptkbp:measures inter-rater agreement
gptkbp:pe expected agreement by chance
gptkbp:po observed agreement
gptkbp:proposedBy gptkb:Jacob_Cohen
gptkbp:publishedIn gptkb:Educational_and_Psychological_Measurement
gptkbp:range -1 to 1
gptkbp:relatedTo gptkb:Fleiss'_kappa
Scott's pi
gptkbp:requires two raters
gptkbp:symbol κ
gptkbp:usedIn psychometrics
statistics
content analysis
gptkbp:yearProposed 1960
gptkbp:bfsParent gptkb:Kendall's_W
gptkbp:bfsLayer 6
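
The gptkbp:form statement gives κ = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement (gptkbp:po) and p_e is the agreement expected by chance (gptkbp:pe). A minimal Python sketch of that computation for two raters over categorical data follows; the function name and sample labels are illustrative, not part of the KB entry.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # kappa = (p_o - p_e) / (1 - p_e)
        if len(rater_a) != len(rater_b):
            raise ValueError("both raters must label the same items")
        n = len(rater_a)

        # p_o: observed agreement, the fraction of items both raters label identically
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

        # p_e: chance agreement, from each rater's marginal category frequencies
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2

        if p_e == 1:
            return 1.0  # degenerate case: both raters always use the same single category
        return (p_o - p_e) / (1 - p_e)

    # Two raters classify 10 items; 8/10 raw agreement (p_o = 0.8),
    # chance agreement p_e = 0.52, so kappa = 0.28 / 0.48 = 0.583
    a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
    b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
    print(round(cohens_kappa(a, b), 3))  # 0.583

The result matches the gptkbp:interpretedBy scale: 0 is chance-level agreement, 1 is perfect agreement, and negative values indicate less-than-chance agreement. For production use, scikit-learn's sklearn.metrics.cohen_kappa_score computes the same quantity and also supports weighted variants.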
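The gptkbp:limitation statements (sensitive to prevalence, sensitive to bias) can be made concrete with the sketch above: two rater pairs with identical raw agreement p_o = 0.98 receive very different kappa values once the category prevalence is skewed. The variables below are hypothetical examples reusing the cohens_kappa function defined earlier.

    # Skewed prevalence: 98/100 raw agreement, but chance agreement is already 0.96
    a_skew = ["no"] * 98 + ["yes"] * 2
    b_skew = ["no"] * 97 + ["yes", "yes", "no"]
    print(round(cohens_kappa(a_skew, b_skew), 2))  # 0.49

    # Balanced prevalence: the same 98/100 raw agreement now yields a high kappa
    a_bal = ["yes"] * 50 + ["no"] * 50
    b_bal = ["yes"] * 49 + ["no"] * 50 + ["yes"]
    print(round(cohens_kappa(a_bal, b_bal), 2))  # 0.96

The mechanism is visible in the formula: as the marginals skew toward one category, p_e grows, leaving less room above chance for the same p_o, which is why kappa drops from 0.96 to 0.49 here despite identical raw agreement.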