From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!linac!midway!ellis!gal2 Tue Nov 19 11:09:34 EST 1991
Article 1259 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!linac!midway!ellis!gal2
From: gal2@ellis.uchicago.edu (Jacob Galley)
Subject: Chinese Room, from a different perspective
Message-ID: <1991Nov11.011527.28514@midway.uchicago.edu>
Keywords: ai philosophy searle expert system
Sender: news@midway.uchicago.edu (NewsMistress)
Organization: University of Chicago
References: <1991Nov7.151439.3353@osceola.cs.ucf.edu>
Date: Mon, 11 Nov 1991 01:15:27 GMT

This is just something that occurred to me a while ago. I haven't studied
up on the Chinese Room problem, so maybe this argument has already been
made:

There's this guy in a room with a set of instructions for translating
(or communicating in) Cantonese, isn't that how it goes? Said guy has
absolutely no prior knowledge of Cantonese, but he knows how to follow
the instructions.

Searle sets up this situation and then argues that the guy still does
not know Cantonese. He then makes an analogy to the strong-AI problem,
and concludes that a computer can never "understand" or have a grip on
meaning in the way we humans do.

I argue: His assertion that the guy doesn't know Cantonese is fine, but
what about the guy-room system together, as one entity? Doesn't IT know
Cantonese? I'm sorry that I don't have the specifics on hand, but it seems
that Searle set up the Chinese Room so that it simulated intelligence
(and proficiency in Cantonese) and then pulled out one part of it (the
guy) and asked if that part was really intelligent. Remember Aunt Hillary
in Hofstadter's _Goedel, Escher, Bach_? She was an ant colony who was
intelligent even though the individual ants and the communications they
used were not intelligent. "The Whole is Greater than the Sum of its
Parts," as it were.

Isn't Searle missing this in his Chinese Room? The guy is only the
CPU of the guy-room system. The analogy between the guy alone and an
intelligent machine is not well formed.
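The CPU analogy above can be sketched in a few lines of code. This is my own toy illustration, not anything from Searle: the phrases and the rule book are made up, and a real rule book would have to be vastly larger, but it shows how the "guy" can be a generic rule-follower while the conversational competence lives only in the combined guy-plus-rule-book system.

```python
# Toy sketch of the Chinese Room (illustrative only; the phrases
# below are invented romanized stand-ins, not a real rule book).

# The instruction book: inert data mapping incoming symbol strings
# to outgoing symbol strings. It "does" nothing by itself.
RULE_BOOK = {
    "nei5 hou2": "nei5 hou2!",
    "nei5 sik6 zo2 faan6 mei6?": "sik6 zo2 laa3.",
}

def operator(slip_of_paper, rule_book):
    """The guy in the room: matches the symbols on the slip against
    the book and copies out the listed reply. He understands none of
    it -- he is just a lookup mechanism, i.e. the system's CPU."""
    return rule_book.get(slip_of_paper, "???")

def chinese_room(slip_of_paper):
    """The guy-room system as one entity: only this combination
    of rule-follower plus rule book holds up its end of the
    conversation."""
    return operator(slip_of_paper, RULE_BOOK)
```

The point of the sketch: `operator` alone answers nothing in particular, and `RULE_BOOK` alone answers nothing at all. Whatever "knowing Cantonese" amounts to here, it is a property of `chinese_room`, the whole, not of either part.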

-- 
Here is the address to complain to:
             Jacob Galley, a full-time student with a part-time reality check
						     gal2@midway.uchicago.edu


