From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!wupost!uunet!psinntp!scylla!daryl Tue Mar 24 09:55:45 EST 1992
Article 4458 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!wupost!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: The Systems Reply I
Message-ID: <1992Mar14.200214.28723@oracorp.com>
Organization: ORA Corporation
Date: Sat, 14 Mar 1992 20:02:14 GMT

michael@psych.toronto.edu (Michael Gemar) writes:

> While Searle may very well believe that there is at most a one-to-one
> correspondence between bodies and minds, this assumption is *not*
> necessary for his argument to carry through. The main objections that
> he raises are just as successful (or unsuccessful) if we posit that
> minds are the result of *many* physical bodies (taking neurons as
> "physical bodies", this is already true), or even if minds are
> disembodied spirits.

Searle claims that, in the Chinese Room, the man *is* the system, and
he uses this claim to deduce his conclusion that there is no
understanding in the Chinese Room. There might indeed be a similar
argument that doesn't depend on this claim, but it is not Searle's
argument.

> The crucial move that Searle makes is to assume that if a mind
> (however constituted) performs computations that would generate
> subjective experience (understanding, qualia, whatever), then the mind
> performing the computations should have these experiences.

The crucial question is: why does Searle assume this? It is not
because he believes it is true (he doesn't believe that computations
can generate subjective experience in any case). It is not because
Strong AI proponents think it is true. So why does he introduce this
assumption that absolutely nobody believes? It is a crucial step in
his disproof of Strong AI, and it is completely unmotivated, as far as
I can see.

In previous posts, I have given reasons for why I think that the
Chinese Room system might have a subjective feeling of understanding,
while the man who memorized the rules would not: mainly, the
subjective feeling of understanding involves the unity and coherence
of concepts. The English-speaking mind has no such coherence, since it
can't relate its English thoughts with the Chinese symbols it is
manipulating.

> This is all that is required for the counter to the Systems Reply. This
> position may very well be debatable, but it is important to note that
> it is a red herring to concentrate on the possibility of two minds in
  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> one head. The counter itself does *not* require any assumptions about
  ^^^^^^^^
> embodiment of minds.

It is not a red herring. The central issue in the Systems Reply is
that the understanding of the Chinese Room system neither implies nor
is implied by the understanding of the man following the rules.

Daryl McCullough
ORA Corp.
Ithaca, NY