Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!rochester!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!andrew.cmu.edu!fb0m+
From: fb0m+@andrew.cmu.edu (Franklin Boyle)
Newsgroups: comp.ai.philosophy
Subject: Re: Chinese Room, from a different perspective
Message-ID: <4dCxHKK00WBKA3VAgN@andrew.cmu.edu>
Date: 3 Dec 91 19:02:46 GMT
Organization: Cntr for Design of Educational Computing, Carnegie Mellon, Pittsburgh, PA
Lines: 40

Mark Schnitzius writes:

> Perhaps this is it (and this has always been my argument with
> the Chinese Room analogy).  The question of whether the man knows
> Chinese is simply irrelevant to the argument.  The man is simply
> part of the system, where the man plus the rules define the system.
> Saying that the man does not know Chinese is like saying that a
> brain cell in your brain is not intelligent, or that a computer
> which happens to be running an intelligent program is not of itself
> intelligent.  The system as a whole must be considered.

Based on this, I'm not sure you've read Searle, but suffice it to
say that, because of the "systems reply", he did allow that the man
in the room could internalize what was in the book, so that the man
(everything from his skin inward) *is* the system (unless you feel
that the cinder blocks of which the walls of the room are made somehow
add to the system's understanding).  And if you maintain that the man,
once he has memorized the rules, and the rules themselves are (still)
two subsystems, Searle discusses that reply as well.

I guess I just don't see the problem the people arguing against Searle
have. Let's suppose you have memorized Searle's hypothesized rule book
(so what you have is a lot of associations between different symbols and
groupings of symbols as well as chains of such associations).  You have *no*
idea what the symbols refer to, however, and, therefore, what they or
groupings of them mean.  Now, let someone write you a message in Chinese
asking you what colors daffodils are. Then let someone write you a message
in English asking the same question.  Do you "understand" both questions?
(Note: this is different from asking, "Can you respond to both
questions with the appropriate answer?")  Presumably you didn't
understand the first question (at least I wouldn't have).  Why do you
think you didn't?
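To make that concrete, here is a toy sketch (mine, not Searle's) of
what the memorized rule book amounts to: a purely syntactic mapping
from input symbol strings to output symbol strings.  The entries are
invented for illustration -- the Chinese strings just stand in for
Searle's squiggles, and no step in the program consults what any
symbol refers to.

    # Toy "rule book": input symbol strings mapped to output symbol
    # strings.  The entries are made up for illustration; the English
    # glosses in the comments are for the reader, not the system.
    RULE_BOOK = {
        "水仙花是什么颜色?": "黄色和白色",  # "what color are daffodils?" -> "yellow and white"
    }

    def room(message):
        # Match the incoming string against the memorized rules purely
        # by its form and emit the associated string.  Nothing in this
        # step depends on what either string means.
        return RULE_BOOK.get(message, "")

    # The room produces the appropriate answer without the rule-follower
    # knowing what either symbol string refers to.
    print(room("水仙花是什么颜色?"))

The point of the sketch is just that the mapping is applied by shape
alone: the appropriate answer comes out even though no meaning goes in.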

Consider the problem *physically*.  Do you think you can determine what
trees, for example, look like or feel like from the *physical* structures
of all the symbols and symbol strings that have to do with trees? (remember,
*you* don't know what those symbols or symbol strings mean, so *the only* 
possible sources of information are the physical structures of the symbol
strings and associations between those physical structures.)

-Frank


