There is a need for assessment items that measure complex constructs but can also be scored efficiently for the evaluation of teacher education programs. In an effort to measure the construct of teacher attentiveness in an efficient and scalable manner, we are using exemplar responses elicited by constructed-response item prompts to develop selected-response assessment items. Through analyses of think-aloud interview data, this study examines the alignment between participant responses to, and scores arising from, the two item types. The interview protocol was administered to 12 mathematics teachers and teacher candidates, who were first presented with a constructed-response version of an item and then with the selected-response version built from the same item stem. Our analyses focus on the alignment between responses and scores for eight item stems across the two item types and on the identification of items in need of modification. The results have the potential to influence the way test developers generate and use response process evidence to support or refute the assumptions inherent in a particular score interpretation and use.
Available at: http://works.bepress.com/ya-mo/8/