Interrater Agreement of The Copenhagen Triage Algorithm (Interrater Agreement • May 22nd, 2024)
Julie Inge-Marie H Borchsenius1*, Rasmus Bo Hasselbalch1, Morten Lind1, Lisbet Ravn1, Thomas Kallemose2, Martin Schultz1,4, Lars Simon Rasmussen3,5, Kasper Iversen1,5
Interrater agreement (Interrater Agreement • September 22nd, 2013)
Interrater Agreement and Combining Ratings (Interrater Agreement • November 29th, 2005)
Some behaviors, such as smiles, require human raters for their measurement. A model of the rating process is explored that assumes that the probability distribution of overt rating responses depends on which of several underlying or latent responses occurred. The ideal of theoretically identical raters is considered, as well as departures from such identity. Methods for parameter estimation and for assessing goodness of fit of the model are presented. A test of the hypothesis of identical raters is provided. Simulated data are used to explore different measures of agreement, optimal numbers of raters, how the ratings from multiple raters should be used to arrive at a final score for subsequent analysis, and the consequences of departures from the basic assumptions of identical raters and constant underlying response probabilities. The results indicate that often using two or three raters to rate all of the data, assessing the quality of their ratings by their pairwise correlations, an…
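The abstract's core idea is that each overt rating is drawn conditionally on a latent response, and that a small number of raters whose quality is checked via pairwise correlations is often enough. Below is a minimal sketch of that setup; the 0/1 coding, the probabilities, and the number of items are invented for illustration and are not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items = 200
# Latent response for each item: 1 = behavior present (e.g., a smile), 0 = absent.
latent = rng.binomial(1, 0.4, size=n_items)

# Overt rating probabilities conditional on the latent state, assumed the same
# for both raters (the "identical raters" ideal discussed in the abstract).
p_rate_given_present = 0.9   # P(rater scores 1 | latent = 1)
p_rate_given_absent = 0.1    # P(rater scores 1 | latent = 0)

def rate(latent_states):
    """One rater's overt ratings, drawn conditionally on the latent states."""
    p = np.where(latent_states == 1, p_rate_given_present, p_rate_given_absent)
    return rng.binomial(1, p)

rater_a = rate(latent)
rater_b = rate(latent)

# Pairwise correlation between the two raters' scores: the quality check the
# abstract recommends when two or three raters score all of the data.
r = np.corrcoef(rater_a, rater_b)[0, 1]
print(f"Pairwise interrater correlation: {r:.2f}")
```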
Contract (Interrater Agreement • October 25th, 2024)
Supplementary Table 6. KPPQ-Arabic interrater agreement between the first and second rater.

KPPQ item | Rater A, n (%) | Rater B, n (%) | Kappa value | p value | ICC | 95% CI
Q. 1: Pain around joints (musculoskeletal) | 21 (45.7) | 21 (45.7) | 1 | <0.001 | |
Q. 2: Pain related to internal organ | 7 (15.2) | 7 (15.2) | 0.815 | <0.001 | |
Q. 3: Lower abdominal pain | 16 (34.8) | 15 (32.6) | 0.936 | <0.001 | |
Q. 4: Pain deep within the body | 7 (15.2) | 6 (13.0) | 0.903 | <0.001 | |
Q. 5: Dyskinetic pain | 7 (15.2) | 8 (17.4) | 0.912 | <0.001 | |
Q. 6: “Off” dystonia in a region | 16 (34.8) | 16 (34.8) | 1 | <0.001 | |
Q. 7: Generalized “off” period pain | 9 (19.6) | 9 (19.6) | 1 | <0.001 | |
Q. 8: PLM or RLS-associated pain | 15 (32.6) | 14 (30.4) | 0.935 | <0.001 | |
Q. 9: Pain while turning in bed | 13 (28.3) | 14 (30.4) | 0.934 | <0.001 | |
Q. 10: Pain when chewing | 5 (10.9) | 4 (8.7) | 0.807 | <0.001 | |
Q. 11: Pain due to grinding teeth | 6 (13.0) | 6 (13.0) | 1 | <0.001 | |
Q. 12: Burning mouth syndrome | 0 (0) | 0 (0) | - | - | |
Q. 13: Burning pain in the limbs | 8 (17.4) | 8 (17.4) | 1 | <0.001 | |
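The per-item statistics in this table are Cohen's kappa values for two raters' yes/no judgments. Here is a minimal sketch of how such a value could be computed; the ratings below are made up for illustration and are not the KPPQ-Arabic data. The guard for undefined kappa also shows why Q. 12, where neither rater endorsed the item for any patient, is reported as "-".

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical ratings of the same subjects."""
    a, b = np.asarray(a), np.asarray(b)
    categories = np.union1d(a, b)
    observed = np.mean(a == b)  # observed agreement, p_o
    expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)  # chance agreement, p_e
    if expected == 1:           # both raters used a single category for every subject
        return float("nan")     # kappa is undefined, as for Q. 12 above
    return (observed - expected) / (1 - expected)

# Hypothetical example: 10 patients scored present (1) / absent (0) for one pain item.
rater_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
rater_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
print(f"kappa = {cohens_kappa(rater_a, rater_b):.3f}")  # 0.800 for these made-up ratings
```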
Additional file 6. Interrater agreement for ultrasound signs in the derivation cohort (Interrater Agreement • December 30th, 2021)
M-mode: time-motion mode; PLAPS+: positive posterolateral alveolar and/or pleural syndrome. A kappa value (κ) ≥ 0.81 represents excellent agreement, 0.61 to 0.80 substantial agreement, 0.41 to 0.60 moderate agreement, 0.21 to 0.40 fair agreement, and 0.01 to 0.20 slight agreement. A Spearman correlation coefficient (r) between 0.10 and 0.39 indicates a weak, between 0.40 and 0.69 a moderate, and between 0.70 and 0.89 a strong positive relationship [14, 15].
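A small sketch of those interpretation rules, assuming SciPy is available for the Spearman correlation. The kappa thresholds follow the text above; the catch-all label for values below 0.01 and the sample ordinal ratings are illustrative additions, not taken from the file.

```python
from scipy.stats import spearmanr

def interpret_kappa(k):
    """Map a kappa value to the agreement bands quoted in the legend above."""
    if k >= 0.81:
        return "excellent"
    if k >= 0.61:
        return "substantial"
    if k >= 0.41:
        return "moderate"
    if k >= 0.21:
        return "fair"
    if k >= 0.01:
        return "slight"
    return "poor / no agreement"  # catch-all for values the legend does not name

print(interpret_kappa(0.72))  # prints "substantial"

# Spearman rank correlation between two raters' ordinal scores (invented data).
rater_a = [3, 1, 4, 2, 5, 2, 3]
rater_b = [3, 2, 4, 1, 5, 2, 4]
r, p = spearmanr(rater_a, rater_b)
print(f"Spearman r = {r:.2f} (p = {p:.3f})")
```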