Article 1872 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!jupiter!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!hri.com!spool.mu.edu!munnari.oz.au!metro!cluster!minnie.cs.su.OZ.AU!timc
From: timc@minnie.cs.su.OZ.AU (Tim Brabin Cooper)
Newsgroups: comp.ai.philosophy
Subject: The Chinese Room Fallacy
Keywords: Searle, Chinese
Message-ID: <3728@cluster.cs.su.oz.au>
Date: 5 Dec 91 04:58:47 GMT
Sender: news@cluster.cs.su.oz.au
Lines: 52


	Searle's argument is easily seen to be wrong. He obviously doesn't
understand what an emergent process is. (He almost says as much when he
calls it "absurd" that the combination of man + rule-books should have
properties that neither has individually.)

			--oOo--

	The thing is, in such a situation, the system of man + books + tablets
IS an intelligent system, or at least one which understands Chinese in a
very real way. This concept of regarding the system as a whole is probably
unfamiliar to most philosophers, but to programmers it is the most natural
thing in the world.

			--oOo--

	Searle's answer to that is to say, "Theoretically, the man could
internalise the set of rules & procedures" (i.e. memorise them). But
then the man would embody the entire system & so he would understand Chinese!
Searle's argument does not show that, in this situation, there is
no understanding. To be precise, you could argue that the man's mind now
exists at two levels of abstraction: the level which manipulates the rules,
and the higher level which emerges from the rules; it is the latter which
understands Chinese.


			--oOo--

	The idea that understanding can emerge from simple rules is not as
counter-intuitive as it first seems. An AI programmer would start thinking
about how to construct the rules which the man uses. There would be tablets
corresponding to words and tablets corresponding to purely internal concepts.
There would be a procedure for parsing sentences; procedures to link up the
words with all the other associations & entities that they relate to (this
is how the 'meaning' of the words is obtained); procedures for following the
meanings through to 'responses' (whether predicate-logic based,
connectionist-based, or based on some other system); and procedures for
converting the internal representations of concepts back into words. Anyone
can see how this broad outline could work, and there is nothing mystical or
magical going on, nor is there any 'cheating' at understanding.
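
	To make the broad outline concrete, here is a toy sketch in Python
of the pipeline just described: parse a word into an internal concept,
follow its associations to get its 'meaning', derive a response concept,
and convert that back into words. Every rule and name in it is a
hypothetical placeholder (it is nothing like a real model of Chinese);
the point is only the shape of the rule-book.

```python
# Toy sketch of the rule-book pipeline. All rules here are hypothetical
# placeholders; only the structure (parse -> associate -> respond ->
# verbalise) matters.

# "Tablets" mapping words to purely internal concepts.
LEXICON = {"ni hao": "GREETING", "zai jian": "FAREWELL"}

# Associations linking each concept to related concepts -- the 'meaning'.
ASSOCIATIONS = {"GREETING": ["POLITENESS", "OPENING"],
                "FAREWELL": ["POLITENESS", "CLOSING"]}

# Rules following a meaning through to a response concept.
RESPONSES = {"GREETING": "GREETING", "FAREWELL": "FAREWELL"}

# Rules converting internal concepts back into words.
VERBALISE = {"GREETING": "ni hao", "FAREWELL": "zai jian"}

def reply(utterance):
    concept = LEXICON.get(utterance.lower())   # parse the input
    if concept is None:
        return "?"                             # outside the rule-book
    meaning = ASSOCIATIONS[concept]            # link to associations
    response_concept = RESPONSES[concept]      # follow through to a response
    return VERBALISE[response_concept]         # convert back into words

print(reply("ni hao"))
```

	Of course, a rule-book with two words in it understands nothing;
the claim in the text is about what emerges when the same structure is
scaled up to the full richness of a language.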
	This doesn't prove that understanding occurs, since that would be
begging the question, but I've tried to make the idea of complex organised
behaviour emerging from simple rules seem more intuitive. The definition
of 'understanding' that I accept is this (adapted to this situation):

	A system understands Chinese if it can stand up to a large degree
	of probing and behave like a typical Chinese speaker, which in
	turn it can only do if the computation manipulates concepts which
	are intrinsic to the Chinese language and human culture.

	(PS-What is Searle's definition?)

			--oOo--


