Article
Why Can’t it Mark this one? A Qualitative Analysis of Student Writing Rejected by an Automated Essay Scoring System
English in Australia (2018)
  • Nathanael Reinertsen, Australian Council for Educational Research (ACER)
Abstract
The difference between how humans read and how Automated Essay Scoring (AES) systems process written language means that a portion of student responses will be comprehensible to human markers but cannot be parsed by AES systems. This paper examines pieces of student writing that were marked by trained human markers but subsequently rejected by an AES system during the development of a scoring model for eWrite, an online writing assessment offered by the Australian Council for Educational Research. The features of these ‘unscoreable’ responses are examined through qualitative analysis. The paper reports on the features common to a number of the rejected scripts and considers whether the computer-generated error codes are appropriate descriptors of the writing. Finally, it considers the implications of the results for teachers using AES in assessing writing.
Keywords
  • Automated marking errors
  • Assessing writing
  • Automated essay scoring
  • Written language
  • Student writing
Publication Date
2018
Citation Information
Nathanael Reinertsen. "Why Can’t it Mark this one? A Qualitative Analysis of Student Writing Rejected by an Automated Essay Scoring System." English in Australia Vol. 53, Iss. 1 (2018), pp. 52-60. ISSN: 0046-208X.
Available at: http://works.bepress.com/nathanael-reinertsen/1/