From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon Mar  9 18:33:26 EST 1992
Article 4097 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Message-ID: <1992Feb27.223758.3619@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1992Feb25.182526.12698@oracorp.com>
Date: Thu, 27 Feb 1992 22:37:58 GMT

In article <1992Feb25.182526.12698@oracorp.com> daryl@oracorp.com writes:
>Christopher Green writes (in response to Stanley Friesen):
>
>CG:
>
>   1. Brains cause minds. Now, of course, that's really too crude....
>   2. Syntax is not sufficient for semantics....a conceptual truth....
>   3. Computer programs are entirely defined by their formal, or syntactical
>      structure....true by definition [of a computer program]
>   4. Minds have mental contents; specifically, they have semantic contents....
>      just an obvious fact about the way minds work....
>
>  Conclusion 4. For any artefact that we might build which had mental states
>              equivalent to human mental states, the implementation
>              of a computer program would not by itself be sufficient.
>              Rather, the artefact would have to have powers equivalent to 
>              the powers of the human brain.
>
>SF:
>    As I have already stated, I question assumptions 2 and 3.
>
>CG: 
>
>> I can't conceive of what you object to in 3. It doesn't need
>> evidence.  It's utterly analytic. Learning to program, even a little,
>> should convince you.
>
>I think you (like Searle before you) are equivocating on the use of
>the phrase "programs are purely syntactic". A program is certainly a
>syntactic object; it is a formal description, or specification, of a
>class of systems (the "implementations" of the program). Learning to
>program involves (at least in part) learning the syntax of a
>programming language. However, the fact that a program is syntactic
>(as is any formal description) does not mean that the implementations
>of the program are purely syntactic. This kind of reasoning is akin
>to: "Hydrogen is described by the Schrodinger equation. The
>Schrodinger equation is mere marks on a piece of paper. Therefore,
>hydrogen is marks on pieces of paper."

The burden therefore rests upon the AI crowd to explain how an implementation
somehow gains semanticity simply by virtue of being an implementation.  And, to
reverse your example above, Searle uses exactly the same reasoning to conclude
that minds *aren't* merely formal descriptions, but depend, just like hydrogen,
on *what they're made of*.  If *this* is what you mean by the importance
of implementation, then Searle would agree.

- michael
