From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!sdd.hp.com!think.com!paperboy.osf.org!hsdndev!husc-news.harvard.edu!zariski!zeleny Mon Dec 16 11:01:48 EST 1991
Article 2112 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca rec.arts.books:11583 sci.philosophy.tech:1408 comp.ai.philosophy:2112
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!sdd.hp.com!think.com!paperboy.osf.org!hsdndev!husc-news.harvard.edu!zariski!zeleny
From: zeleny@zariski.harvard.edu (Mikhail Zeleny)
Newsgroups: rec.arts.books,sci.philosophy.tech,comp.ai.philosophy
Subject: Re: Zeleny (was Re: Searle
Message-ID: <1991Dec14.020037.6551@husc3.harvard.edu>
Date: 14 Dec 91 07:00:35 GMT
References: <1991Dec8.180409.6324@husc3.harvard.edu> <1991Dec9.161328.16412@cherokee.uswest.com> <4327@brahma.cs.hw.ac.uk>
Organization: Dept. of Math, Harvard Univ.
Lines: 123
Nntp-Posting-Host: zariski.harvard.edu

In article <4327@brahma.cs.hw.ac.uk> 
sfleming@cs.hw.ac.uk writes:

>ken@dakota (Kenny Chaffin) writes:

>>zeleny@zariski.harvard.edu (Mikhail Zeleny) writes:

MZ:
>>>Symbolic representation presupposes semantics.

KC:
>>	Does it?
>>	Surely animals have symbols of things in their minds; does this also
>>involve having semantics?  Bees, for instance, communicate with one another
>>through a dance.  Do bees attach meaning to the various "words" and symbols
>>in their dance?

I've answered this elsewhere.

SF:
>[First of all, let me appease the RAB crowd with a book reference...]
>K. von Frisch, 1966.  The Dancing Bees.  Methuen, London.
>
>Extract from P. N. Johnson-Laird, 1983.  Mental Models: Towards a Cognitive
>Science of Language, Inference, and Consciousness.  CUP, p. 405:
>
>"There is no doubt that the bee constructs a model of the spatial location
>of the food source and is able to transmit the salient features of this model
>to her fellow workers...The bee uses a symbolic response A'' that corresponds
>to an element in an internal representation A' that corresponds to a state of
>affairs A in the world...Yet bees do not possess a language in which symbols
>refer to the world in the way in which human beings can make reference to it."

I don't see any justification for the last sentence.  Why is it obvious
that bees do not possess a language in which symbols (i.e. the dance
patterns) refer to the world in the way in which human beings can make
reference to it?
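Johnson-Laird's A/A'/A'' chain can at least be made concrete with a toy
encode/decode sketch.  (The parameter names and the linear distance-to-duration
rule below are illustrative assumptions, not von Frisch's actual calibration.)

```python
def encode_dance(food_bearing_deg, food_distance_m, sun_bearing_deg):
    """A' -> A'': map the bee's spatial model to waggle-dance parameters."""
    waggle_angle = (food_bearing_deg - sun_bearing_deg) % 360  # angle vs. the sun
    waggle_duration = food_distance_m / 100.0                  # assumed linear scale
    return waggle_angle, waggle_duration

def decode_dance(waggle_angle, waggle_duration, sun_bearing_deg):
    """A'' -> A': a fellow worker reconstructs the spatial model."""
    food_bearing = (waggle_angle + sun_bearing_deg) % 360
    food_distance = waggle_duration * 100.0
    return food_bearing, food_distance

# The dance "refers" insofar as decoding recovers the original model:
angle, duration = encode_dance(75.0, 400.0, sun_bearing_deg=30.0)
print(decode_dance(angle, duration, sun_bearing_deg=30.0))  # (75.0, 400.0)
```

Whether such a fixed, context-relative code amounts to reference "in the way in
which human beings can make reference" is precisely what is at issue.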

SF:
>Model-theoretic semantics leads to the hypothesis that meaning is a composition
>of the meanings of components of a sentence (for example). 

Pace Putnam, Davidsonian model-theoretic semantics is bankrupt; as for
other model-theoretic approaches, compositionality is a mere convention,
which is in any case incompatible with the empirically observable
distinction between rhetorical figures of speech and figures of thought.
It's no accident that Davidson denies metaphorical truth: he is simply
incapable of accounting for it.  On the other hand, in Church's system
compositionality is a mere afterthought, and can be readily dispensed with. 
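To fix ideas, the compositionality thesis SF appeals to can be sketched as a
toy model-theoretic evaluator: the truth value of a sentence is computed from
the denotations of its parts.  (The model and lexicon below are invented for
illustration; no endorsement of Davidson is intended.)

```python
# A toy model: the extension of each lexical item.
model = {
    "bees": {"maya", "willi"},
    "dances": {"maya"},
}

def denote(word):
    """Look up the denotation of a lexical item in the model."""
    return model[word]

def every(noun_ext, verb_ext):
    """[[every N V]] is true iff the extension of N is included in that of V."""
    return noun_ext <= verb_ext

def some(noun_ext, verb_ext):
    """[[some N V]] is true iff the extensions of N and V overlap."""
    return bool(noun_ext & verb_ext)

# Sentence meanings composed from the meanings of their parts:
print(some(denote("bees"), denote("dances")))   # True
print(every(denote("bees"), denote("dances")))  # False
```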

SF:
>                                                        The significance of
>a sentence transcends the meaning because it can only be established by
>relating the propositional representation (a statement with a truth value) to
>a mental model and to general knowledge.

What the hell is a "mental model"?  I can appreciate associating the
significance of a sentence-token with "general knowledge" (e.g. with
lexicographic information), as well as with the context of its utterance,
especially if it contains demonstrative pronouns, tensed verbs, or other
indexicals; but what can an objectively determinable significance have to
do with subjective mental contents?
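The concession about indexicals can itself be illustrated: a token containing
one gets a truth value only relative to a context of utterance together with
general knowledge.  (All names and data below are invented for illustration.)

```python
import datetime

# Stand-in for "general knowledge": which days it rained.
general_knowledge = {"rain_days": {datetime.date(1991, 12, 13)}}

def evaluate(sentence, context):
    """Resolve indexicals against the context, then check against knowledge."""
    if sentence == "it rained yesterday":
        # "yesterday" is an indexical: it denotes nothing absent a context.
        day = context["utterance_date"] - datetime.timedelta(days=1)
        return day in general_knowledge["rain_days"]
    raise ValueError("unknown sentence")

ctx = {"utterance_date": datetime.date(1991, 12, 14)}
print(evaluate("it rained yesterday", ctx))  # True
```

Note that nothing in this evaluation appeals to subjective mental contents,
which is the point pressed above.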

SF:
>                                         To understand what the bee
>communicates, effectively you must become a bee.

Perhaps.  Still, I could make a pretty good guess.  How can I understand
what you communicate without becoming you?

SF:
>[I seem to remember from far off an Oriental legend called _The Dream of
>the Ants_]
>
>We have, conceivably, ways of "becoming bees".  The "mental structure" of a
>bee is perhaps one million neurons.

Sorry to burst your bubble, but neurons belong to the somatic structure,
which, pending the resolution of the mind-body problem, should never be
identified with the mental.

SF:
>                             If we constructed a working model of a bee,
>complete with telemetry capability, and caused it to enter a hive and carry
>out the dance, would we then be "communicating" with the other bees?  How
>would this hypothesis be tested?  What if we "communicate" the wrong
>information?  Can bees lie?

Surely *you* would be communicating to the same extent that the bees do;
the really interesting question is whether your bee simulacrum would be
communicating as well. 

SF:
>If there is only one symbol, is there a need for semantics ?  Or does the need
>for semantics arise from the need to differentiate between more than one
>symbol ?

Semantics is needed whenever anything at all is denoted and/or expressed.

>STF

>>KAC

>--
>sfleming@cs.hw.ac.uk                        ...ukc!cs.hw.ac.uk!sfleming


`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'
: Qu'est-ce qui est bien?  Qu'est-ce qui est laid?         Harvard   :
: Qu'est-ce qui est grand, fort, faible...                 doesn't   :
: Connais pas! Connais pas!                                 think    :
:                                                             so     :
: Mikhail Zeleny                                                     :
: 872 Massachusetts Ave., Apt. 707                                   :
: Cambridge, Massachusetts 02139           (617) 661-8151            :
: email zeleny@zariski.harvard.edu or zeleny@HUMA1.BITNET            :
:                                                                    :
'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`


