Newsgroups: sci.lang,sci.psychology,rec.arts.books,comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!miner.usbm.gov!rsg1.er.usgs.gov!stc06.ctd.ornl.gov!fnnews.fnal.gov!usenet.eel.ufl.edu!news.mathworks.com!news.kei.com!wang!news
From: bruck@actcom.co.il (Uri Bruck)
Subject: Re: Chomsky on Consciousness and Dennett (performance barriers)
Organization: ACTCOM - Internet Services in Israel
Date: Tue, 13 Jun 1995 22:15:32 GMT
Message-ID: <DA4ttx.IFt@actcom.co.il>
References: <JMC.95May29092827@SAIL.Stanford.EDU> <3qqqhd$g4j@percy.cs.bham.ac.uk> <802255266snz@troas.demon.co.uk> <3qv525$kh9@acmez.gatech.edu> <3r20sf$i9c@news.ox.ac.uk> <3r2kbr$s9r@nnrp.ucs.ubc.ca>
Sender: news@wang.com
Lines: 49
Xref: glinda.oz.cs.cmu.edu sci.lang:40097 sci.psychology:43147 comp.ai.philosophy:28830

Adam Constabaris (constab@unixg.ubc.ca) wrote:
: : Tim,

: : You wrote:
: : >Actually Searle fell a bit short of this.  He showed that such a system
: : >could replicate the behavior of an intelligence which understood *how to
: : >translate* Chinese.  This is quite different from understanding Chinese.

: : Good point, well made.

: Well, not quite :)  Searle's Chinese room, as I recall, is a set-up 
: which takes Chinese sentences on pieces of paper and outputs other 
: pieces of paper with Chinese sentences on them.  The person inside the 
: room has *instructions in English* which tell him or her what sort of 
: output to give for a given input.

: There's no translation of Chinese to English going on here, at least not 
: in the usual sense.  If there were, in fact, would not the person in the 
: room understand?

: AC

Searle claims that the person in the room performs a mechanical task
of converting one set of symbols into another, with no understanding of at
least one of those sets.
His conclusion is that a mechanical process can have no understanding.
But Searle's analogy puts a human being in the room. Would it not
be fair to assume that a person performing such a task would start
remembering some rules and memorizing some sets of symbols, and thus,
in the mechanical process of converting, would start to 'learn' Chinese?
You may argue that he may learn to read it, but not to speak Chinese.
Well, there are many different dialects of Chinese, and all of them
are written the same way, so learning to speak the language is irrelevant;
resistance to learning is futile ;)
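To make the 'mechanical task' concrete, the room's rulebook can be sketched
as a plain lookup table: the operator matches the incoming symbol string and
copies out the listed reply, never needing to know what any symbol means.
(A toy sketch only; the particular symbols and rules here are my own
invention, not Searle's.)

```python
# Toy "Chinese Room": a rulebook mapping input symbol strings to
# output symbol strings.  The rules are invented for illustration.
RULEBOOK = {
    "你好吗": "我很好",          # made-up rule: greeting -> reply
    "你叫什么名字": "我叫房间",   # made-up rule: name question -> reply
}

def room_reply(symbols: str) -> str:
    """Apply the rulebook mechanically; fall back to a stock reply.

    The operator performs this lookup with no understanding of
    what goes in or what comes out.
    """
    return RULEBOOK.get(symbols, "请再说一遍")  # "please say it again"

print(room_reply("你好吗"))  # operator outputs this without understanding it
```

The point of the sketch is only that the operator's job is pure symbol
matching; whether repeated matching would lead to learning is the question
raised above.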

That entire argument is, admittedly, beside any point Searle has ever made;
it bends the rules of analogy and argues by cheating.
The next part is more serious.
Searle assumes that translation, or more specifically machine translation,
can be accomplished by using a set of substitution rules. Many people
on this group, and on the neighboring sci.lang.translation, have read
posts about or had some experience with machine translation.
I don't think it has been shown that unsupervised machine translation
of more than a simple phrase is viable; therefore I would like to challenge
the validity of the Chinese Room analogy as an analogy of a mechanical
process.
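A quick illustration of why pure substitution rules fall short of
translation (the tiny English-French lexicon is my own, assumed for the
example):

```python
# Word-for-word substitution ignores word order, agreement, and
# idiom, so it produces ungrammatical output even for a trivial
# sentence -- one reason substitution alone is not translation.
SUBST = {"I": "je", "love": "aime", "you": "te"}  # assumed mini-lexicon

def substitute(sentence: str) -> str:
    """Replace each word by its table entry, keeping unknown words."""
    return " ".join(SUBST.get(word, word) for word in sentence.split())

print(substitute("I love you"))  # -> "je aime te"
```

The rule-by-rule output "je aime te" is wrong; French requires "je t'aime",
with the object pronoun moved before the verb and elided. No per-word
substitution table can capture that.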
Uri Bruck
bruck@actcom.co.il

