From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Tue Mar 24 09:54:25 EST 1992
Article 4357 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: The Systems Reply I
Organization: Department of Psychology, University of Toronto
References: <1992Mar6.185926.18497@oracorp.com>
Message-ID: <1992Mar9.171606.6886@psych.toronto.edu>
Date: Mon, 9 Mar 1992 17:16:06 GMT

Chalmers has complained that discussion of the Chinese Room argument 
never advances to any great degree.  However, I think that Daryl presents
a common misconception that, when cleared away, might help to focus things
a little more on the crucial aspects of the main response to Searle, namely,
the Systems Reply.


In article <1992Mar6.185926.18497@oracorp.com> daryl@oracorp.com (Daryl McCullough) writes:

[WRT to Systems Reply]

>From the point of view of computationalism, Searle's response is
>completely off the mark. Searle assumes the following chain of
>reasoning:
>
>   One physical body => One system => One mind 
>   => The human doesn't understand => The system doesn't understand.
>
>The very first step, that one physical body implies one system, is
>simply false, and illustrates that Searle doesn't really have any idea
>what a "system" is (or at least he doesn't understand what is meant by
>"system" in the Systems Reply).

While Searle may very well believe that there is at most a one-to-one
correspondence between bodies and minds, this assumption is *not*
necessary for his argument to go through.  The main objections that
he raises are just as successful (or unsuccessful) if we posit that minds
are the result of *many* physical bodies (taking neurons as "physical bodies",
this is already true), or even if minds are disembodied spirits.

The crucial move that Searle makes is to assume that if a mind (however
constituted) performs computations that would generate subjective
experience (understanding, qualia, whatever), then the mind performing
the computations should have those experiences.  This is all that is
required for the counter to the Systems Reply.  This position may
very well be debatable, but it is important to note that it is a red
herring to concentrate on the possibility of two minds in one head.  The
counter itself does *not* require any assumptions about the embodiment of minds.

It may very well be the case that the above argument, if correct, rules out
minds that *aren't* housed in a single physical body.  However, I am not
necessarily convinced that this is the case, and, in any event, this can only
be a conclusion drawn from the argument, and not a necessary premise.

- michael 
