Comparison between Inter-rater Reliability and Inter-rater Agreement in Performance Assessment
Original Article • August 31st, 2010
Introduction: Over the years, performance assessment (PA) has been widely employed in medical education, the Objective Structured Clinical Examination (OSCE) being a prominent example. Performance assessment typically involves multiple raters, so consistency among the scores the raters assign is a precondition for an accurate assessment. Inter-rater agreement and inter-rater reliability are two indices used to gauge such scoring consistency. This research primarily examined the relationship between inter-rater agreement and inter-rater reliability.

Materials and Methods: This study used three sets of simulated data, based on raters' evaluations of student performance, to examine the relationship between inter-rater agreement and inter-rater reliability.

Results: Data set 1 had high inter-rater agreement but low inter-rater reliability, data set 2 had high inter-rater reliability but low inter-rater agreement, and data set 3 had high inter-
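The distinction drawn in the results can be made concrete with a small numerical sketch. The scores below are illustrative only and are not the paper's simulated data sets; agreement is approximated here as the proportion of examinees scored within one point by both raters, and reliability is approximated by the Pearson correlation between the two raters' scores. The rater scores and the `percent_agreement` helper are assumptions made for this example.

```python
import numpy as np
from scipy.stats import pearsonr

def percent_agreement(a, b, tolerance=1):
    """Proportion of examinees whose two ratings differ by at most `tolerance` points."""
    a, b = np.asarray(a), np.asarray(b)
    return np.mean(np.abs(a - b) <= tolerance)

# Scenario A: high agreement, low reliability.
# Both raters score every examinee a 7 or an 8, so all pairs of ratings are
# within one point of each other, but the restricted score range leaves the
# raters barely correlated with one another.
rater1_a = np.array([7, 7, 7, 7, 7, 8, 8, 8, 8, 8])
rater2_a = np.array([7, 7, 7, 8, 8, 7, 7, 8, 8, 8])

# Scenario B: high reliability, low agreement.
# Rater 2 rank-orders examinees exactly like rater 1 but is 3 points harsher,
# so the correlation is perfect while no pair of ratings is within one point.
rater1_b = np.array([4, 5, 6, 7, 8, 9, 10, 6, 5, 8])
rater2_b = rater1_b - 3

for label, (r1, r2) in {
    "A (agree, unreliable) ": (rater1_a, rater2_a),
    "B (reliable, disagree)": (rater1_b, rater2_b),
}.items():
    r, _ = pearsonr(r1, r2)
    print(f"{label}: agreement = {percent_agreement(r1, r2):.2f}, correlation = {r:.2f}")
```

Under these assumptions, scenario A prints an agreement of 1.00 against a correlation of only 0.20, roughly paralleling data set 1, while scenario B prints an agreement of 0.00 against a correlation of 1.00, paralleling data set 2.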