From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!mcsun!uknet!edcastle!aifh!bhw Tue Jan 28 12:15:37 EST 1992
Article 2998 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!mcsun!uknet!edcastle!aifh!bhw
From: bhw@aifh.ed.ac.uk (Barbara H. Webb)
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence testing
Message-ID: <1992Jan22.104726.18897@aifh.ed.ac.uk>
Date: 22 Jan 92 10:47:26 GMT
References: <1992Jan14.015806.23985@oracorp.com> <5982@skye.ed.ac.uk> <1992Jan15.185342.11589@aifh.ed.ac.uk> <5993@skye.ed.ac.uk> <1992Jan16.122937.23838@aifh.ed.ac.uk> <6000@skye.ed.ac.uk> <1992Jan17.161938.20312@aifh.ed.ac.uk> <6013@skye.ed.ac.uk> <1992
 <6024@skye.
Reply-To: bhw@aifh.ed.ac.uk (Barbara H. Webb)
Organization: Dept AI, Edinburgh University, Scotland
Lines: 57

In article <6024@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

I'm afraid I don't have time to reply piece by piece to your article,
and besides, I think any ideas this thread contains are getting buried
under excess verbiage. Instead I will try to set out my points clearly
and concisely, and, I hope, identify more accurately where you disagree.
If you would care to do the same, perhaps we can then leave it to the
readers of this thread to make up their own minds?

 * Several people have said that the Turing test is bad because it is
behaviourist (and everyone knows Behaviourism is Bad).

 * Behaviourism is generally considered to be bad (and rejected in favour
of cognitive psychology) because it denies that mentality and/or
cognitive processes have any explanatory role for human behaviour.
 
 * Accepting the Turing test does not require denying that mentality has
an explanatory role for human behaviour: in fact the idea that "the
behaviour is strong evidence for the mentality" seems to follow quite
obviously from the idea that "mentality is involved in any plausible
explanation of the behaviour". Of course, this reasoning doesn't make
the Turing test _sufficient_ because in principle there could be an
alternative way the behaviour could come about. But such alternatives
may be considered so unlikely that the Turing test may be taken to be
sufficient _in practice_.

  I admit (as I think I already did) that my initial statement that
accepting the Turing test was incompatible with Behaviourism was too
strong. A Behaviourist might accept the test because they consider the
behaviour to be all there is. However, I don't think that the pragmatic
approach of "If my computer passes the Turing Test, I don't care if it
really thinks or not" is equivalent to adopting this behaviourist
outlook, because it says nothing at all about what sorts of things may
be involved in explaining the behaviour (well enough to imitate it). I
think this is one of the main places where Jeff would disagree, i.e. he
would say that the pragmatic approach is itself a behaviourist one.

 * Rejecting the Turing test is to say (at the very least) that "the
behaviour is not sufficient evidence for the mentality". It seems to
follow directly from this that "it is conceivable that some alternative
means of obtaining the behaviour exists". I thought Jeff was disputing
this step, but I now suspect what he was objecting to was the stronger
statement that "rejecting the Turing test requires a coherent concept of
an alternative means of obtaining the behaviour".

Now, I realise that this is not required if all you want to do is
point out that the Turing test is _in principle_ insufficient. However,
arguing that the Turing test is insufficient _in practice_ does raise
this problem. But if someone can propose a coherent alternative means
(such as Searle's 'meaningless symbol manipulation') of obtaining the
behaviour, then this constitutes an alternative explanation for the
behaviour in humans as well, which creates the new problem of explaining
why the alternative is plausible for computers but not for humans. I
don't think Searle has adequately explained this.

BW