Stata Help

Calculating Interrater Reliability

Interrater agreement in Stata is calculated with the kappa and kap commands. Which command you use depends on how your data are entered: kap expects one variable per rater, each holding that rater's rating of the subject, while kappa expects one variable per rating category, each holding the number of raters who assigned the subject to that category. Beyond this difference in data layout, the two commands share the same syntax.
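To make the difference concrete, here is a minimal sketch (the variable names, subjects, and ratings are invented for illustration): the same five subjects rated by two raters, entered first in the layout kap expects and then in the layout kappa expects.

    * Layout for kap: one observation per subject,
    * one variable per rater holding that rater's rating.
    clear
    input subject rater1 rater2
    1 1 1
    2 1 2
    3 2 2
    4 3 3
    5 2 3
    end
    kap rater1 rater2

    * Layout for kappa: one observation per subject,
    * one variable per category holding how many raters
    * chose that category (rows sum to the number of raters).
    clear
    input cat1 cat2 cat3
    2 0 0
    1 1 0
    0 2 0
    0 0 2
    0 1 1
    end
    kappa cat1-cat3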

Once you know which data format kappa or kap requires for your case, follow the link below that matches your situation to see instructions.

Kappa ranges from 0 (agreement no better than would be expected by chance) to 1 (perfect agreement); negative values are possible when observed agreement falls below chance. Stata suggests the following guidelines from Landis & Koch (1977) for interpreting what level of agreement a particular kappa value represents (a worked reading follows the list):

  • 0.00 - 0.20: slight
  • 0.21 - 0.40: fair
  • 0.41 - 0.60: moderate
  • 0.61 - 0.80: substantial
  • 0.81 - 1.00: almost perfect
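As a hypothetical reading (reusing the invented two-rater data above), suppose kap reports an estimated kappa of 0.45; that falls in the 0.41 - 0.60 range, so the agreement would be labeled moderate on this scale.

    * Hypothetical interpretation step: kap prints the kappa
    * estimate; compare it against the scale above.
    kap rater1 rater2
    * e.g., an estimated kappa of 0.45 falls in 0.41-0.60,
    * so agreement would be called "moderate".

Note that the output also includes a test of the null hypothesis that kappa is 0, so even a kappa in the "slight" range may be statistically distinguishable from chance.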

Different Instances of Kappa
