Examiner Quality and Consistency across LanguageCert Writing Tests

Bibliographic Details
Published in: International Journal of TESOL Studies, Vol. 4, No. 1, p. 203
Main Authors: Papargyris, Yiannis; Yan, Zi
Format: Journal Article
Language: English
Published: International TESOL Union, 01.04.2022
ISSN: 2632-6779, 2633-6898
DOI: 10.46451/ijts.2022.01.13

Summary: This paper reports on a study of the training and standardisation of examiners who mark LanguageCert's International ESOL (IESOL) suite of English language tests linked to the Common European Framework of Reference (CEFR). Subjects in the study were a set of examiners (N=27) who had been marking LanguageCert's IESOL Writing tests across the six CEFR levels. The focus of the study was the consistency of marking, in terms of severity, within and across the six tests that the examiners mark. Correlations between examiner person measures across all six tests indicated that examiners were broadly consistent, with examiner person measures generally correlating highly with those for the "partner" test: A1 with A2, C1 with C2, and B1 with B2. LanguageCert examiners, who undergo careful training and standardisation, may therefore be seen to mark consistently and accurately across a range of ability levels.
Keywords: examiner quality, examiner consistency, marking, testing, training, LanguageCert