From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!charnel!rat!usc!sdd.hp.com!zaphod.mps.ohio-state.edu!rpi!scott.skidmore.edu!psinntp!psinntp!scylla!daryl Thu Oct  8 10:11:25 EDT 1992
Article 7134 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!charnel!rat!usc!sdd.hp.com!zaphod.mps.ohio-state.edu!rpi!scott.skidmore.edu!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Newsgroups: comp.ai.philosophy
Subject: Re: Dualism
Message-ID: <1992Oct6.171057.26199@oracorp.com>
Date: 6 Oct 92 17:10:57 GMT
Organization: ORA Corporation
Lines: 32

In article <26534@castle.ed.ac.uk>, cam@castle.ed.ac.uk (Chris
Malcolm) writes:

>Many years ago to entertain some schoolkids I wrote a BASIC program
>based on ELIZA which was capable of having apparently sensible
>conversations with the user about how it worked. The point of the
>exercise was to present them with something that behaved as though it
>was intelligent and conscious, but which was capable of dispelling the
>illusion by explaining the simple trickery involved.

While it may be true that there could be a program that behaved as
though intelligent yet was not, ELIZA is not an example. Behaving as
if intelligent means making responses that are appropriate for an
intelligent being in all possible circumstances (for all possible
input/output histories, at least). The trickery involved in ELIZA is
that it behaves acceptably for a shrewdly guessed set of inputs, a set
that is likely to be the first things said to "her". If you try to go
beyond this very small set and say unexpected things, ELIZA's
responses quickly degenerate into nonsense. Taking a look at ELIZA's
code shows you the limitations of "her" abilities to respond
intelligently.
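
The trickery is easy to show in a few lines. Here is a toy ELIZA-style
responder (an illustrative sketch only, not Weizenbaum's actual script;
the rules and replies are invented for the example): each rule pairs a
keyword pattern with a canned reply template, and anything outside the
guessed set falls through to a stock phrase.

```python
import re

# Invented rules for illustration: (pattern, reply template).
# Groups captured from the input are echoed back into the reply.
RULES = [
    (re.compile(r".*\bI am (.*)", re.IGNORECASE),
     "How long have you been {0}?"),
    (re.compile(r".*\bmy (mother|father)\b.*", re.IGNORECASE),
     "Tell me more about your {0}."),
    (re.compile(r".*\?$"),
     "Why do you ask?"),
]
DEFAULT = "Please go on."

def respond(line):
    for pattern, template in RULES:
        match = pattern.match(line)
        if match:
            return template.format(*match.groups())
    # Unexpected input falls through to a stock reply -- this is
    # where the "degeneration into nonsense" comes from.
    return DEFAULT
```

Feed it the first things people tend to say ("I am sad") and it looks
uncannily attentive; say almost anything else and it can only repeat
"Please go on."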

The assumption behind behaviorist AI is not that "If you can fool
someone into thinking a program is intelligent, then it is
intelligent", but "If the program behaves like an intelligent person
in all possible circumstances, then it is intelligent". While there
may be philosophical arguments against this position, people's
gullibility when interacting with ELIZA is not one of them.

Daryl McCullough
ORA Corp.
Ithaca, NY
