What is Inter-Rater Reliability

Inter-rater reliability is the degree of agreement among independent reviewers who rate or assess the same body of work.

Rubrics and other assessment tools that rely on ratings must exhibit good inter-rater reliability; otherwise, the evaluations they produce cannot be considered reliable. A number of statistical methods can be used to calculate inter-rater reliability, and different statistics are appropriate for different types of measurement. For example, Cohen's kappa is commonly used when two raters assign categorical ratings, while the intraclass correlation coefficient is appropriate for numeric scores.
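
As an illustration, the short Python sketch below computes Cohen's kappa for two raters. It is a minimal, self-contained example; the rating labels ("Exceeds", "Meets", "Below") and sample data are hypothetical and are not taken from iRubric.

  from collections import Counter

  def cohens_kappa(rater_a, rater_b):
      """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
      n = len(rater_a)
      # Observed agreement: fraction of items both raters scored identically.
      p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
      # Chance agreement, estimated from each rater's marginal label frequencies.
      freq_a, freq_b = Counter(rater_a), Counter(rater_b)
      p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
      return (p_o - p_e) / (1 - p_e)

  # Hypothetical ratings given by two independent reviewers to six works.
  rater_1 = ["Exceeds", "Meets", "Meets", "Below", "Meets", "Exceeds"]
  rater_2 = ["Exceeds", "Meets", "Below", "Below", "Meets", "Meets"]
  print(round(cohens_kappa(rater_1, rater_2), 3))  # 0.478

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, so a low value would suggest the rubric's level descriptions need to be made clearer.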

How to generate Inter-Rater Reliability Reports in iRubric

iRubric Enterprise Edition has built-in functionality for Inter-Rater Reliability reports. To generate this report:

  1. Open a rubric in the gallery for viewing
  2. Click on the Inter-Rater Reliability button at the top of the screen
  3. Follow the screens to generate the Inter-Rater Reliability report