Examining the kappa of classification

Cohen's kappa measures the agreement between the target and predicted classes, similar to accuracy, but it also takes into account the probability of the agreement occurring by chance. Cohen's kappa is given by the following equation:

kappa = (p0 - pe) / (1 - pe)

In this equation, p0 is the relative observed agreement and pe is the probability of chance agreement, derived from the data. Kappa varies between negative values and one, with the following rough categorization from Landis and Koch:

  • Poor agreement: kappa < 0
  • Slight agreement: kappa = 0 to 0.2
  • Fair agreement: kappa = 0.21 to 0.4
  • Moderate agreement: kappa = 0.41 to 0.6
  • Good agreement: kappa = 0.61 to 0.8
  • Very good agreement: kappa = 0.81 to 1

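As a minimal sketch of the equation above (not the recipe's own code), the snippet below computes p0 and pe from a confusion matrix for a pair of made-up label arrays, y_true and y_pred, and cross-checks the result against scikit-learn's cohen_kappa_score():

import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical target and predicted class labels, for illustration only.
y_true = np.array([0, 1, 1, 0, 1, 1, 0, 0, 1, 1])
y_pred = np.array([0, 1, 0, 0, 1, 1, 1, 0, 1, 1])

cm = confusion_matrix(y_true, y_pred)
n = cm.sum()

# p0: relative observed agreement (the fraction of matching labels).
p0 = np.trace(cm) / n

# pe: chance agreement, from the marginal class frequencies.
pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n ** 2

kappa = (p0 - pe) / (1 - pe)
print("Manual kappa:", kappa)

# Cross-check against scikit-learn's implementation.
print("sklearn kappa:", cohen_kappa_score(y_true, y_pred))

For these arrays, p0 is 0.8 and pe is 0.52, giving a kappa of about 0.58, which falls in the "moderate agreement" band of the categorization above.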