Labeler Agreement Sample Contracts

Labeler Agreement in Transcribing Korean Intonation with K-ToBI
Labeler Agreement • September 13th, 2003

This paper reports labeler agreement in the transcription of Korean prosody using Korean ToBI (K-ToBI) [9]. Twenty utterances representing five different types of speech were produced by 18 speakers and transcribed by 21 labelers differing in their levels of experience with K-ToBI. Following the stringent metric used for English ToBI evaluation [14,12], consistency was measured in terms of the number of transcriber pairs agreeing on the labeling of each particular word. The results show that for the tonal transcriptions of the 32,130 transcriber-pair-words, agreement was 77% on the type of boundary at the end of each word (i.e., word, AP, or IP), 78% on AP boundaries, and 91% on IP boundaries. For break indices, agreement was 59% for exact matches, 69% when ignoring the presence or absence of diacritics, and 99% when allowing a discrepancy of +/-1 level. In sum, the data confirm that the K-ToBI conventions are adequate, easy to learn, and can be reliably used for research.
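The pairwise metric described above can be sketched roughly as follows; the function name, data layout, and example labels are illustrative assumptions, not the authors' code. For each word, every pair of transcribers is compared, and agreement is the fraction of transcriber-pair-word comparisons whose labels match exactly.

```python
from itertools import combinations

def pairwise_agreement(labels_by_transcriber):
    """Fraction of transcriber-pair-word comparisons with identical labels.

    labels_by_transcriber: list of equal-length label sequences,
    one per transcriber, aligned word by word.
    """
    agree = total = 0
    for seq_a, seq_b in combinations(labels_by_transcriber, 2):
        for label_a, label_b in zip(seq_a, seq_b):
            total += 1
            if label_a == label_b:
                agree += 1
    return agree / total if total else 0.0

# Hypothetical example: three transcribers labeling the boundary type
# (word, AP, or IP) at the end of each of four words.
transcriptions = [
    ["AP", "word", "AP", "IP"],
    ["AP", "word", "word", "IP"],
    ["AP", "AP", "AP", "IP"],
]
print(pairwise_agreement(transcriptions))  # 8 of 12 pair-word comparisons agree
```

The relaxed break-index scores reported above would follow the same scheme, with the equality test replaced by a looser criterion (e.g., ignoring diacritics, or counting labels within +/-1 level as a match).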

LABELER AGREEMENT IN
Labeler Agreement • November 7th, 2007

This paper analyzes inter-labeler agreement of label choice and boundary placement for human phonetic transcriptions of continuous telephone speech in different languages. In experiment one, English, German, Mandarin and Spanish are labeled by fluent speakers of the languages. In experiment two, German and Hindi are labeled by linguists who do not speak the languages. Experiment two uses a somewhat finer phonetic transcription set than experiment one. We compare the transcriptions of the utterances in terms of the minimum number of substitutions, insertions and deletions needed to map one transcription to the other. Native speakers agree on the average 67.52% of the time at the
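The comparison described above is the standard minimum-edit-distance (Levenshtein) computation over label sequences. The sketch below is a generic implementation under that assumption; the phone labels and the normalization used to turn the distance into a percentage are illustrative, since the excerpt does not state the paper's exact formula.

```python
def edit_distance(ref, hyp):
    """Minimum substitutions, insertions and deletions to turn ref into hyp
    (standard Levenshtein dynamic programming over label sequences)."""
    m, n = len(ref), len(hyp)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[m][n]

# One plausible way to express the distance as an agreement percentage.
labeler_1 = ["h", "ae", "l", "ow"]
labeler_2 = ["hh", "ae", "l", "ow", "w"]
dist = edit_distance(labeler_1, labeler_2)
agreement = 100 * (1 - dist / max(len(labeler_1), len(labeler_2)))
print(dist, round(agreement, 2))  # 2 edits -> 60.0% agreement
```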
