Developing Automated Deceptions and the Impact on Trust
Computer Science & Information Technology Faculty Publications
  • Frances Grodzinsky, Sacred Heart University
  • Keith W. Miller, University of Missouri–St. Louis
  • Marty J. Wolf, Bemidji State University
As software developers design artificial agents (AAs), they often have to wrestle with complex issues that carry philosophical and ethical importance. This paper addresses two key questions at the intersection of philosophy and technology: What is deception? And when is it permissible for the developer of a computer artifact to be deceptive in the artifact’s development? While exploring these questions from the perspective of a software developer, we examine the relationship between deception and trust. Are developers using deception to gain our trust? Is trust generated through technological “enchantment” warranted? Next, we investigate more complex questions of how deception that involves AAs differs from deception that involves only humans. Finally, we analyze the role and responsibility of developers in trust situations that involve both humans and AAs.

Published first online: February 18, 2014.

Citation Information
Grodzinsky, Frances, Keith W. Miller, and Marty J. Wolf. "Developing Automated Deceptions and the Impact on Trust." Philosophy & Technology 28.1 (2015): 91-105.