Newsgroups: comp.ai
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!usc!news.isi.edu!gremlin!charming!mcohen
From: mcohen@charming.nrtc.northrop.com (Martin Cohen)
Subject: Re: Asimov's Robotic Laws?
Message-ID: <D05774.LxB@gremlin.nrtc.northrop.com>
Sender: news@gremlin.nrtc.northrop.com (Usenet News Manager)
Organization: Northrop Grumman Automation Sciences Laboratory, Pico Rivera, CA
References: <3atvgh$4if@netaxs.com> <3avou2$h1v@news.acns.nwu.edu> <1994Nov26.075248.8687@sol.UVic.CA>
Date: Thu, 1 Dec 1994 17:28:15 GMT
Lines: 47

>>>: The Three Laws of Robotics
>>>
>>>: 1.  A robot may not injure a human being, or, through inaction, allow a
>>>: human being to come to harm.
>>>
>>>: 2.  A robot must obey the orders given it by human beings except where
>>>: such orders would conflict with the First Law.
>>>
>>>: 3.  A robot must protect its own existence as long as such protection does
>>>: not conflict with the First or Second Law.
>>>
>>>:                            Handbook of Robotics
>>>:                            56th Edition, 2058 A.D.
>>>
>>>There must be a mistake..  If anyone asks this robot to jump out the 
>>>window it would just do it..  
>>>There must be several more important laws..
>>>
>
>I have to disagree with this. The robot would clearly not jump out the window,
>as this is in direct violation of Law 3. While the robot has to obey
>Laws 1 & 2, it still has the overriding law of self-preservation described
>in Law 3. Only if jumping out of the window saved a human life would the act
>be justified. That's just my call.
>
>Evan Wise
>Ewise@sol.uvic.ca

Nope - as long as the command (to self-destruct) did not conflict
with either of the higher laws, it would be obeyed. For example,
if there were a human where the robot would land, the robot
could refuse. Or, if the robot had been given a larger than
usual weight on its own preservation (a strengthened Third Law) and the
command to self-destruct was weak ("aw, go play on the freeway"),
then the command might be resisted.

But if the command to self-destruct was very strong and no
human was endangered, the command would be obeyed.
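To make the ordering concrete, here is a minimal sketch of the decision
procedure described above: the First Law is an absolute veto, and below it
an order (Second Law) is weighed against self-preservation (Third Law).
All the names and numeric "strengths" here are my own invention for
illustration - nothing like this appears in Asimov.

```python
def obeys_self_destruct_order(command_strength, endangers_human,
                              self_preservation_strength):
    """Hypothetical sketch: does the robot obey an order to destroy itself?

    command_strength / self_preservation_strength are made-up weights
    standing in for how forcefully the order was given and how much the
    Third Law has been strengthened in this particular robot.
    """
    # First Law: never allow harm to a human, regardless of orders.
    if endangers_human:
        return False
    # Second Law vs. Third Law: a strong enough order overrides
    # ordinary self-preservation.
    return command_strength > self_preservation_strength

# "Aw, go play on the freeway" vs. a strengthened Third Law: resisted.
print(obeys_self_destruct_order(0.2, False, 0.8))   # False
# A direct, forceful order with no human endangered: obeyed.
print(obeys_self_destruct_order(0.9, False, 0.5))   # True
```

The interesting stories, of course, live near the boundary, where the
weights are close and the outcome is genuinely uncertain.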

This kind of conflict provided the basis for a number
of Asimov's robot stories.


-- 
Marty Cohen (mcohen@nrtc.northrop.com) - Not the guy in Philly
  This is my opinion and is probably not Northrop Grumman's!
          Use this material of your own free will
