Assessing Inter-Annotator Agreement for Medical Image Segmentation
Author Manuscript • April 4th, 2023

Training and evaluation of artificial intelligence (AI)-based medical computer vision algorithms depend on annotation and labeling. However, variability between expert annotators introduces noise into training data that can adversely impact the performance of AI algorithms. This study aims to assess, illustrate, and interpret the inter-annotator agreement among multiple expert annotators when segmenting the same lesion(s)/abnormalities on medical images. We propose the use of three metrics for the qualitative and quantitative assessment of inter-annotator agreement: 1) a common agreement heatmap and a ranking agreement heatmap; 2) the extended Cohen's kappa and Fleiss' kappa coefficients for a quantitative evaluation and interpretation.
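
The abstract names two of these tools concretely: a pixel-wise agreement heatmap and the Fleiss' kappa coefficient. The sketch below shows how both can be computed from a stack of annotator masks; it is not the authors' implementation, and the function names, the NumPy-based approach, and the assumption of binary lesion masks are my own.

```python
import numpy as np

def agreement_heatmap(masks: np.ndarray) -> np.ndarray:
    """Pixel-wise fraction of annotators marking each pixel as lesion.

    masks: (R, H, W) array of binary {0, 1} segmentations from R
    annotators. Returns an (H, W) map in [0, 1]: 1.0 means all
    annotators agree the pixel is lesion, 0.0 that none do.
    """
    return masks.mean(axis=0)

def fleiss_kappa(masks: np.ndarray) -> float:
    """Fleiss' kappa over pixels, treating each pixel as an item rated
    by R annotators into 2 categories (background / lesion)."""
    r = masks.shape[0]                        # number of annotators
    flat = masks.reshape(r, -1)               # (R, N), N = H*W pixels
    n_pos = flat.sum(axis=0)                  # "lesion" votes per pixel
    n_ij = np.stack([r - n_pos, n_pos], axis=1)  # (N, 2) category counts
    # Per-pixel agreement P_i and its mean over all pixels
    p_i = (np.sum(n_ij ** 2, axis=1) - r) / (r * (r - 1))
    p_bar = p_i.mean()
    # Chance agreement from the marginal category proportions
    p_j = n_ij.sum(axis=0) / n_ij.sum()
    p_e = np.sum(p_j ** 2)
    if p_e == 1.0:                            # unanimous everywhere
        return 1.0
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical usage with three simulated noisy annotators:
rng = np.random.default_rng(0)
truth = rng.random((64, 64)) > 0.7
masks = np.stack([(truth ^ (rng.random(truth.shape) > 0.95)).astype(int)
                  for _ in range(3)])
heat = agreement_heatmap(masks)               # values in {0, 1/3, 2/3, 1}
print(fleiss_kappa(masks))                    # near 1 => strong agreement
```

In the heatmap, values near 0 or 1 indicate consensus, while intermediate values flag contested regions, typically lesion boundaries. Kappa values are conventionally interpreted on the Landis-Koch scale, where values above 0.8 indicate almost perfect agreement.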
