Kappa statistic

From Citizendium
Revision as of 07:45, 27 December 2007 by imported>Robert Badgett (Adapted from WP)

Interpretation

Landis and Koch[1] proposed the schema in the table below for interpreting kappa values.

Proposed interpretation of kappa values

Kappa value   Interpretation
< 0           Poor agreement
0.00–0.20     Slight agreement
0.21–0.40     Fair agreement
0.41–0.60     Moderate agreement
0.61–0.80     Substantial agreement
0.81–1.00     Almost perfect agreement
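The banding above can be applied directly in code. The sketch below is illustrative, not part of the Landis and Koch paper: it computes Cohen's kappa for two raters using the standard formula κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance, and then maps the result onto the table's bands. The function names and the example ratings are invented for this illustration.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters' labels."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired ratings"
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

def landis_koch(kappa):
    """Map a kappa value to its Landis-Koch interpretation band."""
    if kappa < 0:
        return "Poor agreement"
    bands = [(0.20, "Slight agreement"),
             (0.40, "Fair agreement"),
             (0.60, "Moderate agreement"),
             (0.80, "Substantial agreement"),
             (1.00, "Almost perfect agreement")]
    for upper, label in bands:
        if kappa <= upper:
            return label

# Hypothetical example: two raters classify six items as yes/no.
a = ["yes", "yes", "no", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes"]
k = cohen_kappa(a, b)
print(round(k, 4), landis_koch(k))  # 0.6667 Substantial agreement
```

Here the raters agree on 5 of 6 items (p_o ≈ 0.833) while chance alone would give p_e = 0.5, so κ ≈ 0.667, which falls in the 0.61–0.80 "Substantial agreement" band.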

References

  1. Landis JR, Koch GG (1977). "The measurement of observer agreement for categorical data". Biometrics 33 (1): 159–74. PMID 843571