Contribution to Book
Safety Intelligence and Legal Machine Language: Do we need the Three Laws of Robotics?
Service Robot Applications (2008)
  • Yueh-Hsuan Weng, NCA, Ministry of the Interior, Republic of China
  • Chien-Hsun Chen, National Nano Device Laboratories
  • Chuen-Tsai Sun, National Chiao Tung University
Abstract
The aim of this chapter is to offer a fundamental framework for a legal system focused on safety issues involving New Generation Robots (NGRs). This framework is offered in response to the lack of clarity regarding robot safety guidelines, despite the impending development and release of tens of thousands of robots into workplaces and homes around the world. The authors propose a Safety Intelligence (SI) concept for NGRs that addresses issues tied to open-texture risk for robots that will have a relatively high level of autonomy in interactions with humans. We express doubt that Asimov’s Three Laws of Robotics model will be a suitable foundation for creating an artificial moral agency that ensures robot safety. Instead, we offer an alternative Legal Machine Language (LML) model that utilizes non-verbal information from robot sensors and actuators to protect both humans and robots. To implement an LML model, roboticists must design a biomorphic nerve reflex system, and legal scholars must define safety content for robots having a certain degree of “self-awareness”.
Keywords
  • Robot Safety
  • Service Robots
Publication Date
May 15, 2008
Editor
Yoshihiko Takahashi
Publisher
InTech Education and Publishing, ISBN 978-953-7619-00-8
Citation Information
Yueh-Hsuan Weng, Chien-Hsun Chen and Chuen-Tsai Sun, “Safety Intelligence and Legal Machine Language: Do we need the Three Laws of Robotics?”, in Yoshihiko Takahashi (Ed.), Service Robot Applications, Vienna: InTech Education & Publishing, August 2008. ISBN 978-953-7619-00-8. Available at: http://works.bepress.com/weng_yueh_hsuan/3