From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl Mon Mar  9 18:36:03 EST 1992
Article 4340 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: Definition of understanding
Message-ID: <1992Mar6.194405.22939@oracorp.com>
Organization: ORA Corporation
Date: Fri, 6 Mar 1992 19:44:05 GMT

christo@psych.toronto.edu (Christopher Green) writes:

[About the claim that there may be more than one system in one
physical body]

> If you really want your argument to rely wholly on the very dubious
> assumption that there are, somehow, two minds running around inside
> the man's head, feel free, but the utter tendentiousness of the claim
> is patently obvious to everyone not committed a priori to the belief
> that computers JUST GOTTA have minds.  In short, it's nothing short of
> an ad hoc shoring up of a failing research program...

In my opinion, nothing could be further from the truth. There is
nothing ad hoc about the claim that there could be several minds in
one head; it is a *necessary* consequence of (a) the Strong AI
position, and (b) the assumption that the man has memorized the
Chinese Room program.

On the other hand, Searle's assumption that memorization guarantees
there is only one mind strikes me as completely ad hoc and
unmotivated. In any case, Searle's premise that "one brain => one
mind" is certainly not obvious.

Daryl McCullough
ORA Corp.
Ithaca, NY
