Evaluation of Restricted Domain Question-Answering Systems
The School of Information Studies: Faculty Scholarship
  • Anne R. Diekema, Utah State University
  • Ozgur Yilmazel, Syracuse University
  • Elizabeth D. Liddy, Syracuse University
Document Type
Article
Date
1-1-2004
Keywords
  • question-answering
  • QA
  • question-answering evaluation
  • open-domain systems
  • TREC QA
  • restricted-domain systems
  • test question development
  • answer key creation
  • test collection construction
Language
English
Description/Abstract

Question-Answering (QA) evaluation efforts have largely been tailored to open-domain systems. The TREC QA test collections contain newswire articles, and the accompanying queries cover a wide variety of topics. While some apprehension about the limitations of restricted-domain systems is no doubt justified, the strict promotion of unlimited-domain QA evaluations may have unintended consequences. Simply applying the open-domain QA evaluation paradigm to a restricted-domain system poses problems in the areas of test question development, answer key creation, and test collection construction. This paper examines the evaluation requirements of restricted-domain systems, incorporating evaluation criteria identified by users of an operational QA system in the aerospace engineering domain. While the paper demonstrates that user-centered, task-based evaluations are required for restricted-domain systems, these evaluations are found to be equally applicable to open-domain systems.

Citation Information
Anne R. Diekema, Ozgur Yilmazel, and Elizabeth D. Liddy. "Evaluation of Restricted Domain Question-Answering Systems" (2004).
Available at: http://works.bepress.com/anne_diekema/45/
Creative Commons License
Creative Commons Attribution 3.0