Article 4060 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!qt.cs.utexas.edu!zaphod.mps.ohio-state.edu!uakari.primate.wisc.edu!ames!haven.umd.edu!darwin.sura.net!Sirius.dfn.de!fauern!unido!mcsun!fuug!news.funet.fi!sunic!dkuug!diku!kurt
From: kurt@diku.dk (Kurt M. Alonso)
Newsgroups: comp.ai.philosophy
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Message-ID: <1992Feb26.133639.5838@odin.diku.dk>
Date: 26 Feb 92 13:36:39 GMT
References: <1992Feb25.182526.12698@oracorp.com>
Sender: kurt@rimfaxe.diku.dk
Organization: Department of Computer Science, U of Copenhagen
Lines: 103

daryl@oracorp.com writes:

>Christopher Green writes (in response to Stanley Friesen):

>CG:

>   1. Brains cause minds. Now, of course, that's really too crude....
>   2. Syntax is not sufficient for semantics....a conceptual truth....
>   3. Computer programs are entirely defined by their formal, or syntactical
>      structure....true by definition [of a computer program]
>   4. Minds have mental contents; specifically, they have semantic contents....
>      just an obvious fact about the way minds work....

>  Conclusion 4. For any artefact that we might build which had mental states
>              equivalent to human mental states, the implementation
>              of a computer program would not by itself be sufficient.
>              Rather, the artefact would have to have powers equivalent to 
>              the powers of the human brain.

>SF:
>    As I have already stated, I question assumptions 2 and 3.

>CG: 

>> I can't conceive of what you object to in 3. It doesn't need
>> evidence.  It's utterly analytic. Learning to program, even a little,
>> should convince you.

>I think you (like Searle before you) are equivocating on the use of
>the phrase "programs are purely syntactic". A program is certainly a
>syntactic object; it is a formal description, or specification, of a
>class of systems (the "implementations" of the program). Learning to
>program involves (at least in part) learning the syntax of a
>programming language. However, the fact that a program is syntactic
>(as is any formal description) does not mean that the implementations
>of the program are purely syntactic. This kind of reasoning is akin
>to: "Hydrogen is described by the Schrodinger equation. The
>Schrodinger equation is mere marks on a piece of paper. Therefore,
>hydrogen is marks on pieces of paper."

A bad syllogism! That Hydrogen may be described by means of a formal
system does not mean that Hydrogen IS [part of] that formal system.
The theory is nothing but an instrument that 'reveals' this or that
ABOUT Hydrogen. Whatever knowledge is acquired in this fashion is
*mediated* by the theory. There is no possibility that the theory has
the semantics of Hydrogen in itself, for its only possible semantics
are *given* *prior* (in a logical sense) to the theory by the scientist
himself. That is to say, the theory can only remain a theory, an
instrument by which certain knowledge is made possible.

The same is true of programs, for they are constructed as instruments,
and the only real semantics possible (that in the mind of the programmer)
is the one put forward by the user, prior to the program's being used
as an instrument. To claim that a program has semantics in the same
sense that the programmer has is equivalent to saying that the program
is no longer an instrument when it is running. That is, that the
program (and perhaps whatever entity runs it) is only 'accidentally'
(not 'essentially') an instrument of the user who runs it. This must,
indeed, contradict whatever you believe a computer is (I certainly
hope so!).

So programs are only syntactic entities!

>Now, let's turn to the other sense in which it is commonly claimed
>that "programs are purely syntactic". *Computers* don't directly
>manipulate real objects; they cannot, for instance, examine a real
>hamburger; they can only examine a syntactic internal representation of
>a hamburger. In this sense, computers are called syntactic because (a)
>they only deal with representations, not with the real things, and (b)
>the representations are discrete and digital. These points are
>certainly true, but they make the claim that "Syntactic means no
>semantics" very dubious. *Human brains* don't directly manipulate the
>real things, either, they only manipulate internal representations, so
>in that sense, they are just as syntactic as computers. The only point
>left of Searle's syllogism--and a point that Steven Harnad seems
>impressed by--is that computers, by dealing with *digital*
>representations, can never grasp the analog world. This seems pretty
>dubious reasoning to me; certainly it is not the analytic truth that
>Searle claimed for his point 3.

The same argument:

It is simply a logical impossibility that a computer acquire the 'real'
semantics, for that would be equivalent to its losing the mediateness
that we have designed it with. It would have to be semantically
immediate in its very mediateness.

To construct a formal system requires that we become conscious, in such a
manner that we differentiate ourselves from the 'real world'; afterwards,
we can construct the formal system as an instrument *only*, to 'deal' with
the world we just left, such that it *mediates* our knowledge of, or our
manipulations of, the world. The only semantics that the formal system can
have is that of the user of the formalism. To confuse such a formalism with
'the real thing' (which in the end is what your claims amount to) is simply
a contradiction in terms!
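
To put the point concretely, here is a toy sketch (in Python; every
name in it is invented for the illustration). The program below
"inspects a hamburger", but all it actually does is compare tokens;
reading those tokens as food is done entirely by the user, prior to
and outside of the program's running:

```python
# A toy 'hamburger inspector'. The token names ("bun", "patty", ...)
# are hypothetical symbols chosen for this sketch; nothing in the
# machine ties them to real food. That mapping exists only in the
# mind of the user -- the program itself merely compares strings.

HAMBURGER = ["bun", "patty", "lettuce", "bun"]  # a syntactic representation

def is_well_formed(stack):
    """Purely formal check: a 'hamburger' must start and end with 'bun'."""
    return len(stack) >= 2 and stack[0] == "bun" and stack[-1] == "bun"

print(is_well_formed(HAMBURGER))             # True  -- mere token-shuffling
print(is_well_formed(["patty", "lettuce"]))  # False -- no semantics consulted
```

Rename every token to "x17" and the program behaves identically, which
is just the instrumental, mediated character argued for above.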


>Daryl McCullough
>ORA Corp.
>Ithaca, NY


Kurt.


