Q re Kappa statistic

#1
I am trying to calculate inter-rater reliability scores for 10 survey questions, most of which are binary (yes/no). The agreement level between the two raters is 70-90% on nearly every question, yet the Kappa score is often very poor (0.2-0.4).
Can this be right?

Secondly, can you use a kappa score on Likert-scale questions? If not, what is the correct test for inter-rater reliability?

Thanks so much for your help!

Dason

Ambassador to the humans
#2
I am trying to calculate inter-rater reliability scores for 10 survey questions, most of which are binary (yes/no). The agreement level between the two raters is 70-90% on nearly every question, yet the Kappa score is often very poor (0.2-0.4).
Can this be right?
I don't know too much about kappa, but reading this makes it seem like your situation is quite plausible.
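To see how high raw agreement can coexist with low kappa, here is a quick numeric sketch with hypothetical counts (not your data): when one category dominates, chance agreement is already high, so kappa ends up small even at 85% observed agreement.

```python
# Hypothetical 2x2 table for two raters on a binary (yes/no) item:
# a = both yes, b = rater1 yes / rater2 no,
# c = rater1 no / rater2 yes, d = both no
a, b, c, d = 80, 5, 10, 5
n = a + b + c + d

p_o = (a + d) / n                     # observed agreement: 0.85
p_yes1 = (a + b) / n                  # rater 1's "yes" rate
p_yes2 = (a + c) / n                  # rater 2's "yes" rate
# chance agreement: both say yes by chance, or both say no by chance
p_e = p_yes1 * p_yes2 + (1 - p_yes1) * (1 - p_yes2)

kappa = (p_o - p_e) / (1 - p_e)       # Cohen's kappa
print(f"agreement = {p_o:.2f}, kappa = {kappa:.2f}")
# -> agreement = 0.85, kappa = 0.32
```

With 85% agreement but a skewed yes/no split, chance agreement is 0.78, so kappa is only about 0.32, squarely in the 0.2-0.4 range described above. This is sometimes called the kappa paradox.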