From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!wupost!uunet!mcsun!uknet!edcastle!aiai!jeff Thu Jan  9 10:34:10 EST 1992
Article 2562 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!wupost!uunet!mcsun!uknet!edcastle!aiai!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Newsgroups: comp.ai.philosophy
Subject: Re: Searle and the Chinese Room
Message-ID: <5909@skye.ed.ac.uk>
Date: 8 Jan 92 20:45:45 GMT
References: <1991Dec5.191043.10565@psych.toronto.edu> <1991Dec5.225949.2613@bronze.ucs.indiana.edu> <5815@skye.ed.ac.uk> <1991Dec12.193222.27298@bronze.ucs.indiana.edu>
Reply-To: jeff@aiai.UUCP (Jeff Dalton)
Organization: AIAI, University of Edinburgh, Scotland
Lines: 118

In article <1991Dec12.193222.27298@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>In article <5815@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
>>In article <1991Dec5.225949.2613@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>
>>>(1) Recipes are completely syntactic.
>>>(2) Cakes are crumbly.
>>>(3) Syntax is not sufficient for crumbliness.
>>>(4) Therefore implementing the appropriate recipe cannot be sufficient
>>>    to produce a cake.
>>>
>>>Reflection on why this argument is fallacious should lead one to
>>>uncover the fallacy in Searle's analogous argument.
>>
>>It doesn't look enough like Searle's argument to me.  
>
>From Searle, "Minds and Brains without Programs", in (Blakemore/Greenfield,
>eds.) _Mindwaves_, p. 231.  (I've changed the order of the axioms, but
>that's all.)
>
>(1) Programs are defined purely formally, or syntactically.
>(2) Minds have mental contents; specifically, they have semantic contents.
>(3) Syntax is not sufficient for semantics.
>(4) Therefore instantiating a program is never sufficient by itself for
>    having a mind.

I said "it doesn't look enough like Searle's argument to me".  But
I didn't say in what respects it was insufficiently similar.  I had
assumed that you were presenting a close textual parallel to one of
Searle's presentations of one of his arguments.  But people who have
some understanding of Searle's argument aren't so closely tied to the
text that they think that whether it stands or falls depends entirely
on the details of its presentation.

Still, you have made a number of textual changes.  For instance,
where Searle has "sufficient by itself", you have only "sufficient".
You might try to claim that "sufficient by itself" is redundant,
but it at least helps make his meaning clearer.  Syntax isn't
sufficient alone, but it might be enough when combined with
other things.  

We all know that we can follow a recipe to produce a cake.
But it's hard to see how that's the same as instantiating
(another word you've changed) the recipe, much less that it's
the same as relying only on this instantiation to produce the
cake.  We at least have to use the right materials, and it's
the physical properties of the materials that ensure we get
a cake in the end rather than a bowl of sludge.

This doesn't seem that far from Searle's conclusion that at
least some of the physical properties of the brain (the famed
"causal powers") are necessary, in addition to any functional
element, captured as syntax, that might be involved.

Nor is it clear that a recipe can be treated as purely syntactic.
It's necessary to know what words like "egg" and "flour" mean.
They can't be treated as variables standing for arbitrary
ingredients, as the analogy with a program would suggest, if
what you want in the end is a cake.  In effect, there are
two parts to a recipe: instructions for manipulating ingredients
(this is the program), and an indication of what ingredients
to use.  The indication is useless for persons (computers)
who don't know the semantics, even if they implement (your
word this time) the program.
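The two-part picture above can be put in programming terms.  Here is a
loose sketch (the function and the ingredient bindings are invented for
illustration, not anything from Searle): the procedure is the purely
formal part, and it runs identically under any binding of the
ingredient names; only the "indication" of what those names denote
separates cake from sludge.

```python
def follow_recipe(ingredients):
    # The "program" part of a recipe: formal manipulation of
    # whatever happens to be bound to the ingredient names.
    batter = ingredients["flour"] + ingredients["eggs"] + ingredients["sugar"]
    return "baked(" + batter + ")"

# The "indication" part lies outside the procedure: it says what
# the names must actually denote.  The procedure itself cannot
# tell these two bindings apart.
real = {"flour": "flour", "eggs": "eggs", "sugar": "sugar"}
fake = {"flour": "sand",  "eggs": "glue", "sugar": "salt"}

follow_recipe(real)  # -> "baked(floureggssugar)"
follow_recipe(fake)  # -> "baked(sandgluesalt)" -- runs just as happily
```

Both calls go through, because the formal steps succeed either way;
whether a cake results depends on the semantics of the names, which is
exactly the part the program does not contain.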

Finally, note that Searle says "having a mind" while you say
"produce a cake".  The mind would be an additional property
the computer would gain by instantiating the program, not
an external object.  A person following a cake recipe doesn't
become crumbly; but if computational theories of mind are
correct, something that employs the right program does become
"minded".

It's things such as these that make me think your version of
the argument isn't close enough to Searle's.  You want me to
conclude that programs can produce minds (presumably in the
entity that implements the program, rather than as external
objects) by means of your analogy with recipes and cakes,
but there are just too many problems with the analogy for
it to be convincing.

In other messages, you said more about this idea of "implementation".
But again the reasoning is not convincing.

In article <> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
|To be more precise, we formalize causal structure in a program; this
|program then has the property that any implementation of it will
|possess that causal structure.  By analogy: we formalize properties
|of cakes in a recipe; then any implementation of that recipe will
|possess the relevant properties.  If you don't like talk of
|"formalization" here, that's fine: substitute "specification" instead.
|The substantive point is unaffected.

Now, if we make an analogy between recipes and programs, we
ought to bear in mind that a recipe is a program that instructs
a person (computer) how to construct an object, namely a cake.
Instructions for putting together a toy would be similar,
as would a program that directs a robot arm to assemble,
say, the door of a car.

The object is not an implementation of the program.  When
implemented, the program gives the computer a certain causal
structure, one that causes it to carry out the instructions
encoded in the program.  Recipes don't affect humans that
use them in the same direct way that programs affect computers.
But it's still the human that's implementing the program,
not the cake.

Now, one might argue that a program could cause a mind to be
constructed inside a computer, rather like a program might
cause a computer that controlled a robot to make a cake,
except that the "cake" would be inside the computer.  But
if all it constructs is another program (and what else can
it do if it's still going to be "syntax"?), then we still
have to determine whether this new program results in a
mind.  The analogy with cakes gets us to the constructed
program, but no further.

-- jd
