This paper presents an empirical evaluation of automated writing evaluation (AWE) feedback used for L2 academic writing teaching and learning. It introduces the Intelligent Academic Discourse Evaluator (IADE), a new web-based AWE program that analyzes the introduction sections of research articles and generates immediate, individualized, and discipline-specific feedback. The purpose of the study was to investigate the potential of IADE's feedback. A mixed-methods approach with a concurrent transformative strategy was employed. Quantitative data consisted of responses to Likert-scale, yes/no, and open-ended survey questions; automated and human scores for first and final drafts; and pre-/posttest scores. Qualitative data comprised students' first and final drafts as well as transcripts of think-aloud protocols and Camtasia computer screen recordings, observations, and semistructured interviews. The findings indicate that IADE's color-coded and numerical feedback possesses potential for facilitating language learning, a claim supported by evidence of focus on discourse form, noticing of negative evidence, improved rhetorical quality of writing, and increased learning gains.
Available at: http://works.bepress.com/elena_cotos/3/