Inter-observer variation can be measured in any situation in which two or more independent observers evaluate the same thing; kappa is intended to quantify that agreement beyond chance (slide presentation; a worked sketch follows the list below).
(PDF) A Formal Proof of a Paradox Associated with Cohen's Kappa
(PDF) Beyond Kappa: A Review of Interrater Agreement Measures
Observer agreement paradoxes in 2x2 tables: comparison of agreement measures | BMC Medical Research Methodology
Agree or Disagree? A Demonstration of An Alternative Statistic to Cohen's Kappa for Measuring the Extent and Reliability of Agreement
(PDF) Modification in inter-rater agreement statistics: a new approach
(PDF) New Interpretations of Cohen's Kappa
Stats: What is a Kappa coefficient? (Cohen's Kappa)
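The items above all concern Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal distributions. The sketch below (plain Python; the counts are illustrative assumptions of mine, not data from any of the cited papers) computes kappa for a 2x2 agreement table and shows the paradox several of these papers analyse: two tables with identical observed agreement can yield very different kappa values once the marginals become skewed.

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table given as a list of lists of counts."""
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed agreement: proportion of cases on the main diagonal.
    p_o = sum(table[i][i] for i in range(k)) / n
    # Chance agreement: sum over categories of the product of the two raters' marginal proportions.
    row_marg = [sum(row) / n for row in table]
    col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(row_marg[i] * col_marg[i] for i in range(k))
    return (p_o - p_e) / (1 - p_e)

# Both hypothetical tables have 85 of 100 cases on the diagonal (85% observed agreement).
balanced = [[40, 9], [6, 45]]   # marginals near 50/50
skewed = [[80, 10], [5, 5]]     # one category dominates

print(round(cohens_kappa(balanced), 2))  # 0.7
print(round(cohens_kappa(skewed), 2))    # 0.32

Running this prints roughly 0.70 for the balanced table but only about 0.32 for the skewed one, even though both show 85% raw agreement; this sensitivity to prevalence in the marginals is exactly the behaviour described as the kappa paradox in the papers listed above.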