kappa coefficient (statistics)
GPTKB entity
Statements (34)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:statistical_analysis |
| gptkbp:alsoKnownAs | gptkb:Cohen's_kappa |
| gptkbp:appliesTo | categorical data |
| gptkbp:category | reliability statistics |
| gptkbp:form | (Po - Pe) / (1 - Pe) (see the code sketch after this table) |
| gptkbp:generalizes | gptkb:Fleiss'_kappa, quadratic weighted kappa, weighted kappa |
| gptkbp:interpretedBy | Landis and Koch scale |
| gptkbp:introducedIn | 1960 |
| gptkbp:limitation | sensitive to prevalence, affected by bias, does not distinguish between types of disagreement |
| gptkbp:measures | inter-rater agreement |
| gptkbp:notRecommendedFor | ordinal data without modification |
| gptkbp:Pe | expected agreement by chance |
| gptkbp:Po | observed agreement |
| gptkbp:proposedBy | gptkb:Jacob_Cohen |
| gptkbp:range | -1 to 1 |
| gptkbp:relatedTo | gptkb:Fleiss'_kappa, Scott's pi, Krippendorff's alpha |
| gptkbp:usedFor | assessing reliability of classification, measuring agreement between two raters |
| gptkbp:usedIn | medical research, psychometrics, statistics, content analysis |
| gptkbp:valueLessThan0Means | less than chance agreement |
| gptkbp:valueOf0Means | agreement equivalent to chance |
| gptkbp:valueOf1Means | perfect agreement |
| gptkbp:bfsParent | gptkb:kappa_(κ) |
| gptkbp:bfsLayer | 8 |
| https://www.w3.org/2000/01/rdf-schema#label | kappa coefficient (statistics) |
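The table gives the closed form (Po - Pe) / (1 - Pe) together with the definitions of Po (observed agreement) and Pe (expected agreement by chance). As a minimal sketch of how those pieces combine for the two-rater case, assuming the raters' labels arrive as plain Python lists; the function names `cohens_kappa` and `interpret_landis_koch` are illustrative, not part of GPTKB:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (Po - Pe) / (1 - Pe) for two raters labeling the same items."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must label the same non-empty set of items")
    n = len(rater_a)

    # Po: observed agreement, the fraction of items both raters label identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Pe: agreement expected by chance, from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    pe = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)

    if pe == 1.0:  # both raters always use one category; kappa is undefined
        raise ValueError("expected agreement is 1; kappa is undefined")
    return (po - pe) / (1 - pe)

def interpret_landis_koch(kappa):
    """Map a kappa value to the Landis and Koch (1977) descriptive scale."""
    if kappa < 0:
        return "poor (less than chance agreement)"
    for upper, label in [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
                         (0.80, "substantial"), (1.00, "almost perfect")]:
        if kappa <= upper:
            return label
    return "almost perfect"
```

A quick worked example: with the labels below, Po = 4/6 and Pe = 0.5, so kappa = (4/6 - 0.5) / (1 - 0.5) = 1/3, which falls in the "fair" band of the Landis and Koch scale.

```python
a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
k = cohens_kappa(a, b)
print(k, interpret_landis_koch(k))  # 0.333... fair
```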