Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: The Systems Reply I
Message-ID: <1992May19.223413.8059@oracorp.com>
Organization: ORA Corporation
Date: Tue, 19 May 1992 22:34:13 GMT
Lines: 75

jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

>You misunderstand me.  I quoted the sentence from Daryl in which "it"
>occurred twice.  Here it is again:
>
>   If you wish to show that computers lack something that humans
>   possess, it seems to me that you need to show (a) that computers
>   lack it, and (b) that humans possess it. If you only prove (a) then
>   you have not proved your point.
>
>The "it" does not refer to both instances of anything -- both
>instances of the word "it" refer to the same thing.  Consequently,
>there can be no question of equivocation here.  Now, the "it" in
>question is understanding, as should be clear if you go back to the
>original message. Moreover, it is the same understanding that
>"nobody is denying".

No, Jeff! The "it" specifically does *not* refer to "understanding".
I tried to make that clear by saying that I am *not* questioning
whether humans understand.

The arguments by which Searle, Penrose, and others try to show that
computers are not capable of human understanding have the following
form:

    1. Human understanding requires X.
    2. Computers lack X.
    3. Therefore, computers are not capable of human understanding.

Most such arguments are about as fishy as week-old sushi. Usually,
premise 1 is justified using one meaning of X, and premise 2 is
justified using a different meaning of X, and so there is no single X
such that it is clear that humans possess X and it is also clear that
computers lack X.

For example, let X be semantics. It seems like a reasonable enough
claim that human thoughts have semantics. After all, our thoughts and
words, by all indications, have real-world referents. What about
computers? Well, a computer program is a syntactic specification of
output symbols in terms of input symbols, and it is well known that no
syntactic specification can uniquely determine the interpretation of
the symbols in a language. Therefore, computers lack semantics.
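To make the indeterminacy concrete, here is a small illustrative
sketch (in Python, and entirely a toy example of my own rather than
anything from Searle or Penrose): a lookup table over two
uninterpreted symbols, 'a' and 'b', that can be read with equal
consistency as computing logical AND or logical OR, depending on
which truth values we choose to assign to the symbols.

# Toy example: a purely syntactic rule mapping pairs of
# uninterpreted symbols to symbols.  The names 'a' and 'b' carry no
# meaning of their own.
RULE = {
    ('a', 'a'): 'a',
    ('a', 'b'): 'b',
    ('b', 'a'): 'b',
    ('b', 'b'): 'b',
}

def machine(x, y):
    """Shuffle symbols according to RULE; nothing more."""
    return RULE[(x, y)]

# Interpretation 1: read 'a' as True and 'b' as False.
# Under this reading the machine computes logical AND.
read1 = {'a': True, 'b': False}

# Interpretation 2: read 'a' as False and 'b' as True.
# Under this reading the very same machine computes logical OR.
read2 = {'a': False, 'b': True}

for x in 'ab':
    for y in 'ab':
        out = machine(x, y)
        assert read1[out] == (read1[x] and read1[y])  # fits AND
        assert read2[out] == (read2[x] or read2[y])   # fits OR

print("Both interpretations fit every input/output pair.")

Nothing in the table itself favors one reading over the other; the
choice of interpretation comes from outside the symbol manipulation.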

But is the semantics that human thoughts possess the same as the
semantics that computer programs lack? I don't think so. The criterion
for saying that human thought has semantics is much, much weaker than
the criterion demanded of computers. There is no proof of uniqueness
of the interpretations of our words. As a matter of fact, it is always
possible to give more than one interpretation to any collection of
utterances. What we have going for us are two facts: (1) Subjectively,
there seems to be a unique, preferred interpretation for our words.
(2) There is a consistent real-world interpretation others give to our
words.

Surely, there is no denying that it is *possible* to give an
interpretation to the utterances made by a computer. The only
question, then, is whether computers are capable of a subjective sense
of the meaning of their words. There is no information one way or the
other on this subject, and the answer certainly doesn't follow from the
observation that "syntax is not sufficient for semantics". We are not
talking about "possessing semantics" in the sense of there being a
unique interpretation of symbols, but about some kind of internal,
subjective sense of meaning.

To reiterate, there is no X such that it is clear that (a) humans
possess X, (b) computers lack X, and (c) X is relevant to
understanding.

Daryl McCullough
ORA Corp.
Ithaca, NY
