Man vs. machine: Investigating the effects of adversarial system use on end-user behavior in automated deception detection interviews
Decision Support Systems
  • Jeffrey Gainer Proudfoot, Bentley University
  • Randall Boyle, Weber State University
  • Ryan M. Schuetzler, University of Nebraska at Omaha
Document Type
Article
Publication Date
March 1, 2016
Abstract

Deception is an inevitable component of human interaction. Researchers and practitioners are developing information systems to aid in the detection of deceptive communication. Information systems are typically adopted by end users to aid in completing a goal or objective (e.g., increasing the efficiency of a business process). However, end-user interactions with deception detection systems (adversarial systems) are unique because the goals of the system and the user are orthogonal. Prior work investigating systems-based deception detection has focused on the identification of reliable deception indicators. This research extends extant work by examining how users of deception detection systems alter their behavior in response to the presence of guilty knowledge, relevant stimuli, and system knowledge. An analysis of data collected during two laboratory experiments reveals that guilty knowledge, relevant stimuli, and system knowledge all lead to increased use of countermeasures. The implications and limitations of this research are discussed, and avenues for future research are outlined.

Comments

© 2016 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Citation Information
Jeffrey Gainer Proudfoot, Randall Boyle, and Ryan M. Schuetzler. "Man vs. machine: Investigating the effects of adversarial system use on end-user behavior in automated deception detection interviews." Decision Support Systems Vol. 85 (2016), pp. 23-33.
Available at: http://works.bepress.com/rschuetzler/10/