From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl Mon Mar  9 18:36:03 EST 1992
Article 4339 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: The Systems Reply II
Message-ID: <1992Mar6.193001.20994@oracorp.com>
Organization: ORA Corporation
Date: Fri, 6 Mar 1992 19:30:01 GMT

The Systems Reply, in my opinion, is not a debating tactic on the part
of computationalists; it is at the heart of computationalism.
Physical objects do not possess (unique) mental properties, but
systems do. A human being is a particular system, with the five senses
as inputs, vocal communication and body movements as outputs, the
brain as memory, and the combination of instinct and learning that
makes up a human personality as the dynamics relating inputs, outputs,
and memory.

If the human memorizes a set of rules for manipulating Chinese
characters, then we can easily identify a second system: the inputs
and outputs for this system are Chinese characters, the memory is the
same human brain as before, and the dynamics are the memorized rules.
By this definition of a system, the Chinese speaker is a different
system from the man who memorized the rules. Also by this definition,
the man's subjective sense of understanding is *not* a part of the
Chinese speaker, since it plays no functional role in the behavior of
that system. The fact that the man, when asked in English whether he
understands Chinese, will say no is therefore not, a priori, relevant
to the question of whether the Chinese speaker implemented by the
rules understands Chinese.
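To make the two-systems point concrete, here is a minimal sketch (in
Python, with hypothetical rule tables and names, not anything from the
original argument) of one system implemented on top of another: the
host carries its own memories and does the rule-following, but the
hosted system's behavior depends only on its own rules and state.

```python
# Hypothetical rule table for the hosted "Chinese speaker" system:
# maps (internal state, input symbol) -> (new state, output symbol).
RULES = {
    ("start", "ni hao"): ("greeted", "ni hao"),
    ("greeted", "zai jian"): ("start", "zai jian"),
}

class Host:
    """The rule-follower. His English memories live in the same
    substrate but play no functional role in the hosted system."""
    def __init__(self):
        self.english_memories = ["hamburger", "childhood summers"]
        self.subsystem_state = "start"   # state of the hosted system

    def step(self, symbol):
        # The hosted system's dynamics are pure rule lookup; nothing
        # in english_memories ever enters this computation.
        self.subsystem_state, output = RULES[(self.subsystem_state, symbol)]
        return output

host = Host()
print(host.step("ni hao"))   # the hosted system answers in kind
```

The only point of the sketch is that the host's own memories are
causally inert with respect to the hosted system, which is why the
host's report "I don't understand" describes him, not it.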

Of course, some people will reason as follows: if one system can be
implemented on top of a second system, then wouldn't any mental
property possessed by the first system also be possessed by the
second? This is the case with the Chinese Room: the Chinese speaker
system is implemented on top of the English-speaking system, so
shouldn't the second system understand everything the first system
does? I think that this kind of reasoning is incorrect. It is true
that some properties possessed by a subsystem will also necessarily be
possessed by the supersystem. The question is whether "having the
subjective sense of understanding" is such a property.

I don't think that it is, because we have a subjective sense of
understanding something when we can relate it to other things we know:
our memories, our senses, our "model" of the world, and so on. In the
case of the English speaker, he has no way to relate the Chinese words
that he manipulates to other things that he knows. He can't relate the
English word "hamburger" to anything in the Chinese speaker subsystem,
and similarly, he can't relate any Chinese words to any of his senses,
or to his childhood memories, or to any English words. However, he
can, through the rules that he has memorized, relate Chinese words to
state information in the Chinese speaker subsystem. In my opinion, it
is this lack of unity among the bits and pieces of information in the
person's brain that gives rise to his subjective sense of a lack of
understanding.

For the Chinese speaker subsystem, there is no similar lack of unity,
since the English words and the human's childhood memories are not
part of that system (and have no causal effect on it). This doesn't
prove that the Chinese speaker subsystem would understand Chinese
(since we haven't even established that systems experience anything at
all). However, we have established that the Chinese speaker subsystem
would have none of the reasons that the human has for feeling a
subjective sense of lack of understanding.

Thus the Systems Reply doesn't prove the Strong AI position, but it
does show that Searle's Chinese Room argument (without supplementary
arguments) has no force in disproving the Strong AI position.

Daryl McCullough
ORA Corp.
Ithaca, NY
