Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!ix.netcom.com!netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: Please outline Turing Test
Message-ID: <jqbCzKDpy.K6J@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <zpc325aAc3pWZf0@pp78hsp.hsp.zer.de> <CzFGJu.E9L@spss.com> <3alnie$f0r@crl2.crl.com>
Date: Sun, 20 Nov 1994 11:39:34 GMT
Lines: 39

In article <3alnie$f0r@crl2.crl.com>, Andrea Chen <dbennett@crl.com> wrote:
>
>
>For many people, a sixties program called "Eliza" passed the
>Turing test.  It was taken as a "real friend".

Since these people didn't know that there was any issue concerning the
intelligence of Eliza, they assumed that it was intelligent and interpreted
its behavior in terms of that cognitive model.  That's quite different from
the TT.

>The result drove the program's author from being a leader in
>AI to being a leading (often irrational) critic.

Irrational indeed.

>I find the result fascinating from an ethical viewpoint.  A
>set of rather simple algorithms convinced people that a machine
>was more caring than most humans and they opened up and quite
>possibly learned a lot about themselves in the process.

There are a number of therapy modalities in which the therapist is
virtually silent, speaking only when the client's self-dialogue ceases
or becomes circular.  The therapist need not be a problem solver or
a source of empathy.  Socrates is subject to mechanization.
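The "rather simple algorithms" behind that style of response can be
sketched as keyword-triggered templates plus pronoun reflection.  This is
a minimal illustration, not Weizenbaum's actual code; the rules and the
reflection table below are invented for the example.

```python
import re

# Invented reflection table: swap first- and second-person words so the
# echoed fragment reads naturally from the "therapist's" side.
REFLECTIONS = {"i": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "me": "you"}

# Invented keyword rules: each pattern captures the rest of the
# utterance, which is reflected and dropped into a question template.
RULES = [
    (re.compile(r"\bi need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bi am (.*)", re.I), "How long have you been {0}?"),
]

def reflect(fragment):
    """Swap pronouns word by word; leave unknown words untouched."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance):
    """Return a templated reply, or a neutral prompt when nothing
    matches -- the near-silent therapist's 'go on'."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(reflect(m.group(1)))
    return "Please go on."
```

No model of the client, no empathy, no problem solving -- just pattern
matching and echoing -- yet the exchange keeps the client talking.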

>Those of us who wish to be "moral" should learn that this
>may not come from the "goodness in our hearts",  but from
>learning a simple set of behaviors.  Taboo breaking, but
>important.

Morality is received as rules for behavior.  It becomes
"the goodness in our hearts" when we internalize it, so that we experience
internal discomfort about violating moral rules.  How does this break taboos?
Few hold that we are born as moral beings, with goodness in our hearts.


-- 
<J Q B>
