Article 4280 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!uwm.edu!linac!att!rutgers!rochester!kodak!ispd-newsserver!psinntp!scylla!daryl
From: daryl@oracorp.com
Newsgroups: comp.ai.philosophy
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Message-ID: <1992Mar5.153056.29003@oracorp.com>
Date: 5 Mar 92 15:30:56 GMT
Organization: ORA Corporation
Lines: 63

shibe@leland.Stanford.EDU (Eric Schaible) writes (quoting me):

>> ...but I don't see why any of the supposed limitations on a
>> computer's ability with semantics do not also apply to human beings.

[stuff deleted]

> I think that your criterion of 'determining' semantics is unreasonably
> strong, and is probably the result of carrying the "computer metaphor"
> of the mind a little too far.

[more stuff deleted]

> I think the following view is more reasonable: Humans do not access
> meanings of words; rather, they interpretively construct meanings in
> an ad hoc, context-dependent way.  When you ask someone to give you a
> definition of a word apart from a context of use, they can give you a
> generalization of the ways in which they use the word; but this is
> merely a generalization--an abstraction from the more fundamental
> process of context-based interpretation.

I agree with everything you say here: meaning is not given by
definitions, but by context. However, this still doesn't illustrate
any way in which humans understand that is inherently inaccessible
to computers. As a matter of fact, I would say that a computer
program that can pass the Turing Test (it behaves "as if" it
understands, always doing the appropriate thing at the appropriate
time) would already possess the contextual aspect of understanding.

> In contrast, for current computers it is clear that if the
> so-called-semantics are not discrete and formal, then processing is
> impossible. The absurdly complete and well-defined semantic structure
> mentioned above is exactly what is needed for the computer to do the
> job. Moreover, it is needed BECAUSE the system possesses no way of
> understanding the term--since the computer has no understanding, one
> needs to make every possible application explicit.

I disagree; I don't think that computers need an explicit semantics
for words any more than humans do. There is no need to program the
*meaning* of words into a computer; it is only necessary to program
it to respond appropriately (depending on the context) to the
occurrence of words, and to use words appropriately. ("Responding
appropriately" could be internal, rather than external.) I don't
believe that humans do any more than this; I don't see *how* humans
could do any more than this.

> To sum up: for the human, there does not need to be one determinate,
> well-defined meaning; for current computers, there does. Your
> application of the computer metaphor of the brain leads you (I think
> incorrectly) to assume that the computer's need for formalism must
> apply to humans as well. As a result, you conclude (again, I think
> incorrectly) that if a human cannot apply or generate a determinate
> semantic representation, the human cannot gain information from
> processing.

Once again, I agree that the idea of dictionary definitions of words
is wrong for human beings; humans don't possess such definitions
(although they may be able to construct them on the spot). However, I
don't think that computers require them any more than humans do.

Daryl McCullough
ORA Corp.
Ithaca, NY
