From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers Thu Jan 16 17:19:50 EST 1992
Article 2658 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Subject: Re: Searle and the Chinese Room
Message-ID: <1992Jan12.214251.21761@bronze.ucs.indiana.edu>
Organization: Indiana University
References: <5909@skye.ed.ac.uk> <1992Jan10.005426.24694@bronze.ucs.indiana.edu> <5949@skye.ed.ac.uk>
Date: Sun, 12 Jan 92 21:42:51 GMT
Lines: 79

In article <5949@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:

>For the rest, the way in which a recipe is analogous to a program
>is that it is a set of instructions that can be followed by a
>computer/person for manipulating ingredients in certain ways.
>What about programs is analogous to saying what the ingredients
>are?

As with any analogy, one shouldn't expect every element to carry
through.  The main point of the analogy is that recipes and programs
are both syntactic objects that effectively act as specifications or
descriptions for physical systems, and for which there exist
implementation procedures by which one can go from the syntactic
object to the physical system.  Blueprints and houses might work
as well as recipes and cakes.

>In any case, the only way a recipe can specify the physico-chemical
>properties of a cake is by relying on the physico-chemical properties
>of the ingredients.  The recipe doesn't produce something that's
>crumbly, a physical property, without the aid of something that
>already has physical properties.

Indeed: and by analogy, the program doesn't produce something that
has causal organization without the aid of something (an implementing
device) that already has causal organization.

>The recipe producing something
>crumbly is supposed to map to the program producing something
>with intentionality, a semantic property.  So, by analogy, the
>program will need the aid of something that already has semantic
>properties.

No: just as none of the elements involved in producing something
crumbly need themselves be crumbly, none of the elements that go into
producing something with intentionality need themselves have
intentionality (the part/whole fallacy comes to mind here).

In any case, the argument doesn't depend on the analogy being
perfect.  The point is just that while programs (like recipes) may
be syntactic, implementations of programs (like cakes) are not --
they're full-blooded causal systems.  While syntax may not be sufficient
for semantics, it's far less obvious that this kind of causation is
insufficient.

>Also, how much of your reply hinges on my use of "employs"?
>Suppose I'd said "implements" instead?  I thought the strong
>AI claim was that a person (whether in Chinese Room or not)
>has a mind as a consequence of implementing the right program.

Actually that's not necessary for the core of strong AI, at least
as I and many others would want to defend it.  The claim is that
implementing a program will lead to a mind, not that implementing
a program is the only way to produce a mind.

We further have to distinguish between two ways of using the
word "implement".  In one sense, the interpreter of a program is
implementing it; in another sense, the entire system (interpreter
plus registers plus whatever) is implementing it (it's best here to
think of the relation "implements" as equivalent to "is an implementation
of").  I've been concerned throughout with the second.  So in the Chinese
Room, the person is not implementing the program in this sense; they're
simply acting as an instrument of the implementation.

>I think my remark is clearer in context, thus:
>
>   Finally, note that Searle says "having a mind" while you say
>   "produce a cake".  The mind would be an additional property
>   the computer would gain by instantiating the program, not
>   an external object.  A person following a cake recipe doesn't
>   become crumbly; but if computational theories of mind are
>   correct, something that employs the right program does become
>   "minded".

The same ambiguity in the term "employ" is being exploited here.

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."