Inter-annotator agreement (IAA) metrics
Cohen’s Kappa
- sklearn.metrics.cohen_kappa_score — scikit-learn 1.1.1 documentation
- Extremely clear explanation: Cohen’s Kappa Statistic - Statistics How To
- Cohen’s Kappa. Understanding Cohen’s Kappa coefficient | by Kurtis Pykes | Towards Data Science
- Important bits:
- It accounts for the probability that the annotators agree purely by chance
- Works for exactly 2 annotators
- Compares labels on the same items - so as input it takes y_1 and y_2 of the same length, with the annotations aligned item by item (see the sketch below)
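For reference, kappa is computed as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance. A minimal sketch of how I’d call the sklearn function on two annotators’ labels (the label values and arrays below are made up for illustration):

```python
from sklearn.metrics import cohen_kappa_score

# Labels assigned by two annotators to the same five items, in the same order.
annotator_1 = ["pos", "pos", "neg", "neg", "pos"]
annotator_2 = ["pos", "neg", "neg", "neg", "pos"]

# Kappa is 1.0 for perfect agreement, ~0 for chance-level agreement.
kappa = cohen_kappa_score(annotator_1, annotator_2)
print(f"Cohen's kappa: {kappa:.3f}")
```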