Search results

  1. Intuitively or statistically: is mean inter-rater reliability useful?

    Hello, I am a "non-statistician" writing my master's in health research, and I have (many, but for now) one question: I'll be using Cohen's kappa, with its many known problems, and also Gwet's AC to calculate IRR in my thesis. I'm now writing the literature review, and find methodological review...
  2. Rater agreement calculation vs crosstabs

    I guess what I am asking about crosstabs is: is rater agreement the same as the specificity or sensitivity percentage?
  3. Rater agreement calculation vs crosstabs

    Hello :) Statistics noob here. This is a general statistics question and not directly an SPSS problem, so I'm posting it here :) I tried calculating the percent agreement between two raters in two ways: 1: Crosstabs in SPSS, variable A x variable B; 2: a simple calculation in SPSS by making a new...
  4. M

    crosstabs ( for cohens kappa) not available with splitfile

    Hello, I'm validating a questionaire for inter-rater reliability ( two raters) using kappa. I am a masters student, not statistics. I have an ordinal categorical variable and wish to get one kappa value + percent agreement per category ( I realize weighted kappa may be more appropriate for...