From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!qt.cs.utexas.edu!cs.utexas.edu!uunet!paladin.american.edu!darwin.sura.net!udel!cis.udel.edu Tue Nov 19 11:09:49 EST 1991
Article 1287 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!qt.cs.utexas.edu!cs.utexas.edu!uunet!paladin.american.edu!darwin.sura.net!udel!cis.udel.edu
From: lintz@cis.udel.edu (Brian Lintz)
Newsgroups: comp.ai.philosophy
Subject: Re: Chinese Room, from a different perspective
Keywords: ai philosophy searle expert system
Message-ID: <70105@nigel.ee.udel.edu>
Date: 12 Nov 91 17:38:05 GMT
References: <1991Nov7.151439.3353@osceola.cs.ucf.edu> <1991Nov11.011527.28514@midway.uchicago.edu>
Sender: usenet@ee.udel.edu
Organization: University of Delaware
Lines: 21
Nntp-Posting-Host: buster.cis.udel.edu

In article <1991Nov11.011527.28514@midway.uchicago.edu> gal2@ellis.uchicago.edu (Jacob Galley) writes:

[Brief Chinese room summary deleted]

>I argue: His assertion that the guy doesn't know Cantonese is fine, but
>what about the guy-room system taken together, as one entity? Doesn't IT
>know Cantonese? I'm sorry that I don't have the specifics on hand, but it seems
>that Searle set up the Chinese Room so that it simulated intelligence
>(and proficiency in Cantonese) and then pulled out one part of it (the
>guy) and asked if that part was really intelligent. 

I always thought basically the same thing. Searle starts with the
premise that the man doesn't know Chinese, and ends with the
conclusion that the man doesn't know Chinese. He assumes what he
intends to prove.

>             Jacob Galley, a full-time student with a part-time reality check
>						     gal2@midway.uchicago.edu

Brian Lintz
lintz@udel.edu
