From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!news.acns.nwu.edu!speedy.acns.nwu.edu!learn Tue May 12 15:50:35 EDT 1992
Article 5576 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!news.acns.nwu.edu!speedy.acns.nwu.edu!learn
From: learn@speedy.acns.nwu.edu (William J. Vajic)
Subject: Re: AI failures
Message-ID: <1992May12.110810.9769@news.acns.nwu.edu>
Sender: usenet@news.acns.nwu.edu (Usenet on news.acns)
Organization: Dares No Organization Like Dis Organization
References: <umpm0INNpv8@early-bird.think.com> <1992May12.213451.5026@csc.canterbury.ac.nz>
Date: Tue, 12 May 1992 11:08:10 GMT
Lines: 38

In <1992May12.213451.5026@csc.canterbury.ac.nz> chisnall@cosc.canterbury.ac.nz 
(The Technicolour Throw-up) writes:

>From article <umpm0INNpv8@early-bird.think.com>, Hans Moravec writes:

>> When I make an AI, I will be sure to construct it so it wants,
>> passionately, to do only and exactly what I want it to do.
>> So if you *ask it*, it will say just that.

Given that humans have, for as long as they have existed, been attempting
to do this with children, and have almost invariably failed, I don't think
your scenario is as simple an achievement as this depicts.

>Now it so happens that your  base  has  a  neat  whizzo  Star  Trek  like
>duplicater  that  can be used to duplicate, almost instantly, and with no
>deleterious side effects to the base (or its  energy  supply),  anything,
>living  or dead, that is placed within it.  Amongst other things this can
>be used to duplicate people.
[......]
>Part  of  this  turmoil  arises  because  we  are  in  effect  destroying
>individuals and we hold it as a right  that  individuals  should  not  be
>deliberately  destroyed.  But duplication removes this worry.  As long as
>either the original or one of the copies remains around afterwards  there
>is  still continuity of that individual.

Dangerfield recently told us about a dog he once had. The dog realized it
looked like its owner and committed suicide.

The other side of this same problem arises when replication is used to
undertake a "deadly task." It seems to me your clones, given equal
intelligence and outlooks on life, the universe, and everything, would
insist on making clones of themselves to take their place in dying on
the mission at hand.

And so it goes........

Bill Vajk   |    (The name was mangled in translation.)
