Fleiss' kappa

GPTKB entity

Statements (25)
Predicate Object
gptkbp:instanceOf statistical measure
gptkbp:alternativeTo percent agreement
gptkbp:appliesTo more than two raters
gptkbp:assumes fixed number of raters per item
gptkbp:citation inter-rater reliability studies
gptkbp:developedBy gptkb:Joseph_L._Fleiss
gptkbp:field medical research
psychometrics
statistics
gptkbp:formula_involves expected agreement by chance
observed proportion of agreement
https://www.w3.org/2000/01/rdf-schema#label Fleiss' kappa
gptkbp:measures degree of agreement
gptkbp:publishedIn gptkb:Psychological_Bulletin
1971
gptkbp:range -1 to 1
gptkbp:relatedTo gptkb:Cohen's_kappa
Scott's pi
gptkbp:used_in categorical ratings
gptkbp:usedFor measuring inter-rater reliability
gptkbp:value_0_means agreement equivalent to chance
gptkbp:value_1_means perfect agreement
gptkbp:value_less_than_0_means less than chance agreement
gptkbp:bfsParent gptkb:Joseph_L._Fleiss
gptkbp:bfsLayer 6
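
The gptkbp:formula_involves statements refer to the standard definition of Fleiss' kappa, which compares observed agreement against agreement expected by chance. A conventional LaTeX rendering (standard notation, not taken from the record itself):

```latex
\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e},
\qquad
\bar{P}_e = \sum_{j=1}^{k} p_j^{2},
\qquad
p_j = \frac{1}{N n} \sum_{i=1}^{N} n_{ij}
```

where $n_{ij}$ is the number of the $n$ raters who assigned item $i$ to category $j$, $\bar{P}$ is the mean observed proportion of agreement over the $N$ items, and $\bar{P}_e$ is the chance-expected agreement. Kappa of 1 means perfect agreement, 0 means agreement equivalent to chance, and negative values mean less than chance agreement, matching the value_* statements above.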
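As a minimal sketch (not part of the GPTKB record), the following Python computes Fleiss' kappa from an items-by-categories count matrix, assuming a fixed number of raters per item as the gptkbp:assumes statement notes. The function name and the toy data are illustrative.

```python
# Sketch: Fleiss' kappa for N items rated into k categories by a fixed
# number of raters n per item. counts[i][j] = number of raters who
# assigned item i to category j.

def fleiss_kappa(counts):
    N = len(counts)            # number of items
    k = len(counts[0])         # number of categories
    n = sum(counts[0])         # raters per item (assumed constant)

    # p_j: proportion of all assignments that fell in category j
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]

    # P_i: observed agreement among raters for item i
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]

    P_bar = sum(P) / N                  # mean observed agreement
    P_e = sum(pj * pj for pj in p)      # agreement expected by chance

    # 0 -> chance-level agreement, 1 -> perfect, < 0 -> below chance
    return (P_bar - P_e) / (1 - P_e)


if __name__ == "__main__":
    # Toy example: 4 items, 3 categories, 5 raters per item.
    ratings = [
        [5, 0, 0],
        [2, 3, 0],
        [0, 0, 5],
        [1, 1, 3],
    ]
    print(round(fleiss_kappa(ratings), 3))  # ~0.492
```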