From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Tue Mar 24 09:56:01 EST 1992
Article 4481 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: The Systems Reply I
Organization: Department of Psychology, University of Toronto
References: <1992Mar14.200214.28723@oracorp.com>
Message-ID: <1992Mar16.230945.3769@psych.toronto.edu>
Date: Mon, 16 Mar 1992 23:09:45 GMT

In article <1992Mar14.200214.28723@oracorp.com> daryl@oracorp.com (Daryl McCullough) writes:
>michael@psych.toronto.edu (Michael Gemar) writes:
>
>> While Searle may very well believe that there is at most a one-to-one
>> correspondence between bodies and minds, this assumption is *not*
>> necessary for his argument to carry through. The main objections that
>> he raises are just as successful (or unsuccessful) if we posit that
>> minds are the result of *many* physical bodies (taking neurons as
>> "physical bodies", this is already true), or even if minds are
>> disembodied spirits.
>
>Searle claims that, in the Chinese Room, the man *is* the system, and
>he uses this claim to deduce his conclusion that there is no
>understanding in the Chinese Room. There might indeed be a similar
>argument that doesn't depend on this claim, but it is not Searle's
>argument.

I disagree that Searle makes *any* use of the embodiment requirement.
The whole force of the argument is carried by the premise discussed below.

>> The crucial move that Searle makes is to assume that if a mind
>> (however constituted) performs computations that would generate
>> subjective experience (understanding, qualia, whatever), then the mind
>> performing the computations should have these experiences.
>
>The crucial question is: why does Searle assume this? It is not
>because he believes it is true (he doesn't believe that computations
>can generate subjective experience in any case). It is not because
>Strong AI proponents think it is true.

I'm not so sure that this was the case *before* Searle made his original
argument.  In any event, Searle believes it to be a consequence of 
the Strong-AI position.  I am coming more and more to believe that he
is wrong, as I begin to better understand the Systems Reply (but this
still does not address the issue of generating semantics from syntax). 

> So why does he introduce this
>assumption that absolutely nobody believes? It is a crucial step in
>his disproof of Strong AI, and it is completely unmotivated, as far as
>I can see.
>
>In previous posts, I have given reasons for why I think that the
>Chinese Room system might have a subjective feeling of understanding,
>while the man who memorized the rules would not: mainly, the
>subjective feeling of understanding involves the unity and coherence
>of concepts. The English-speaking mind has no such coherence, since it
>can't relate its English thoughts with the Chinese symbols it is
>manipulating.

Hmm...what happens when the English speaker takes a course in Chinese?  Can 
he suddenly "read the mind" of the "system"?  Or does the system "mind" 
collapse into a single mind with his?  This kind of thing seems problematic...

>
>> This is all that is required for the counter to the Systems Reply. This
>> position may very well be debatable, but it is important to note that
>> it is a red herring to concentrate on the possibility of two minds in
>  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>> one head. The counter itself does *not* require any assumptions about
>  ^^^^^^^^
>> embodiment of minds.
>
>It is not a red herring. The central issue in the Systems Reply is
>that the understanding of the Chinese Room system neither implies nor
>is implied by the understanding of the man following the rules.

But it *is* a red herring, since Searle claims (wrongly or rightly) that
the *mind* of the man should experience the *mind* of the system.  No
bodies are needed here.  The same claim could be made if the man were
a disembodied spirit.

- michael
