Cohen's Kappa agreement


Definition


Cohen's Kappa is a measure of inter-rater or intra-rater reliability for qualitative variables that score the results of a test procedure. Inter-rater reliability is the agreement between different examiners. Intra-rater reliability is the stability of a measure over time.
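
For reference, kappa compares the observed agreement between raters with the agreement that would be expected by chance alone:

    kappa = (Po - Pe) / (1 - Pe)

where Po is the proportion of assessments on which the raters agree and Pe is the proportion of agreement expected by chance. A kappa of 1 indicates perfect agreement; a kappa of 0 indicates no agreement beyond chance.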


In practice


This test is used when one wants to compare the percentage of agreement between the assessments of different examiners, or between assessments made at different times.

For example, one could be interested in knowing whether coronavirus test procedure X gives the same results 1) when the test is carried out by two different nurses, 2) when it is repeated 15 minutes apart. Point 1) assesses the inter-rater reliability of the test; point 2) assesses its intra-rater reliability.
To run this test, go to "Test 2 variables" and choose two categorical variables.
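
As an illustration, here is a minimal Python sketch (not EasyMedStat code) that computes Cohen's kappa for the nurse example above, using scikit-learn; the patient data are made up:

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical results for the same 8 patients rated by two nurses
    nurse_a = ["Positive", "Negative", "Negative", "Positive",
               "Positive", "Negative", "Positive", "Negative"]
    nurse_b = ["Positive", "Negative", "Positive", "Positive",
               "Positive", "Negative", "Positive", "Negative"]

    # Kappa corrects the raw agreement (7/8 here) for chance agreement
    print(cohen_kappa_score(nurse_a, nurse_b))  # 0.75 for these data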


How to interpret Cohen's Kappa?


The kappa coefficient is commonly interpreted as follows (Landis & Koch, 1977):

  • < 0: Poor agreement
  • 0.00 - 0.20: Slight agreement
  • 0.21 - 0.40: Fair agreement
  • 0.41 - 0.60: Moderate agreement
  • 0.61 - 0.80: Substantial agreement
  • 0.81 - 1.00: Almost perfect agreement
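
For illustration only, here is a small Python helper (not part of EasyMedStat) that maps a kappa value to the Landis & Koch label above:

    def landis_koch_label(kappa: float) -> str:
        # Thresholds from Landis & Koch (1977)
        if kappa < 0.0:
            return "Poor agreement"
        if kappa <= 0.20:
            return "Slight agreement"
        if kappa <= 0.40:
            return "Fair agreement"
        if kappa <= 0.60:
            return "Moderate agreement"
        if kappa <= 0.80:
            return "Substantial agreement"
        return "Almost perfect agreement"

    print(landis_koch_label(0.75))  # Substantial agreement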


How to use it on EasyMedStat?


  1. Go to Statistics > Test variables
  2. Select a first List or Yes-no variable that represents the first observer
  3. Select a second List or Yes-no variable that represents the second observer
  4. Click on "Find an agreement between observers (Kappa)"


The two variables you want to compare must have the same number of modalities and the same modality names.

For example, if the first variable has two possible values, "Positive" and "Negative", the second variable should have the same possible values "Positive" and "Negative".
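
If you prepare your data outside EasyMedStat, a quick sanity check along these lines can catch mismatched modalities before running the test (a sketch with hypothetical column names):

    import pandas as pd

    df = pd.DataFrame({
        "result_nurse_1": ["Positive", "Negative", "Positive"],
        "result_nurse_2": ["Negative", "Negative", "Positive"],
    })

    # The two variables must share exactly the same set of modalities
    modalities_1 = set(df["result_nurse_1"].dropna().unique())
    modalities_2 = set(df["result_nurse_2"].dropna().unique())
    if modalities_1 != modalities_2:
        print("Mismatch:", modalities_1.symmetric_difference(modalities_2))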


