Newsgroups: comp.ai,comp.ai.philosophy,alt.consciousness,comp.ai.alife
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!bloom-beacon.mit.edu!gatech!udel!rochester!galileo.cc.rochester.edu!prodigal.psych.rochester.edu!stevens
From: stevens@prodigal.psych.rochester.edu (Greg Stevens)
Subject: Re: Computers--Next stage in evolution? Hmmmmmm.....
Message-ID: <1995Mar26.184723.13417@galileo.cc.rochester.edu>
Sender: news@galileo.cc.rochester.edu
Nntp-Posting-Host: prodigal.psych.rochester.edu
Organization: University of Rochester - Rochester, New York
References: <3jgqon$gke@usenet.INS.CWRU.Edu> <3jkd1c$d21@unogate.unocal.com> <3jlpjg$ak4@laplace.ee.latrobe.edu.au> <3jmi3b$ljc@agate.berkeley.edu> <D5t31C.86A@topazio.dcc.ufmg.br> <XsXSlmJEWTq5073yn@iaccess.za> <mws.4.00A71980@pond.com> <mws.5.00280780@pond.com>
Date: Sun, 26 Mar 95 18:47:23 GMT
Lines: 74
Xref: glinda.oz.cs.cmu.edu comp.ai:28493 comp.ai.philosophy:26282 comp.ai.alife:2841

In <mws.5.00280780@pond.com> mws@pond.com (Fred Mitchell) writes:
>In article <1995Mar24.175801.6771@galileo.cc.rochester.edu> stevens@prodigal.psych.rochester.edu (Greg Stevens) writes:
>>From: stevens@prodigal.psych.rochester.edu (Greg Stevens)
>>Subject: Re: Computers--Next stage in evolution? Hmmmmmm.....
>>Date: Fri, 24 Mar 95 17:58:01 GMT
>>In <mws.4.00A71980@pond.com> mws@pond.com (Fred Mitchell) writes:
>>>In article <XsXSlmJEWTq5073yn@iaccess.za> spike@iaccess.za (Mark Stockton) writes:

>>>>I would consider language as a sequential programming environment running
>>>>on a multiprocessor.

>>>More importantly, language (human) is a handful of symbols that in themselves 
>>>represent little information, but are linked to vast, vast reservoirs of 
>>>information in our minds....

>>Well, it is only remarkable if you are looking at it from the perspective
>>of the fallacy that language actually communicates information -- that is,
>>that language allows us to encode something in our mental state, transmit
>>it, and get the hearer to decode it so that there is correspondence between
>>our two mental states.

>But that is exactly what happens, no?

I don't think it is, no.

>>Part of the reason we get so much meaning out of symbols is because of purely
>>internal reconstruction.  We construct meaning based on our associations
>>quite independent of the intent or associations of the speaker.  This is
>>not information being "transmitted."  

>Sure it is. The whole idea of "encoding" is to use a set of symbols whose 
>meaning we both mutually agree on. We substitute entire blocks of 
>thought for a sequence of symbols. In essence, we are "encoding" our thoughts.

But our "encoding" mechanism is based purely on our experience, and other
people's "decoding" mechanism on theirs -- so why is there any reason to
assume that the thoughts prompted in the listener are the same as those
initiated in the speaker?  The only requirement for FUNCTIONALLY testing
language effectiveness is behavioral -- there is no functional way of
testing how "accurately" the listener's mental states match the speaker's.

If person A thinks "love" is exchanging flowers every night, and person B
thinks "love" is having sex every night, then person A and person B may
have sex and exchange flowers every night, and when one says to the other
"I love you," they both agree.  Functionally, they communicate.  Is there
transmission of mental states going on?  Can you really say that this is
a case of encoding/decoding mental states with language?
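The point can be made mechanical.  Here is a toy sketch (my own illustration,
not anything from the thread; the mappings are hypothetical) in which two
agents attach *different* private meanings to the same word, yet pass every
behavioral test of agreement:

```python
# Each agent's private "meaning" of a word is modeled as the set of
# behaviors it takes the word to denote.  These sets are deliberately
# different -- no shared "content" exists.
meaning_A = {"love": {"exchange flowers"}}
meaning_B = {"love": {"have sex"}}

# The couple's observable nightly behavior includes both.
observed_behavior = {"exchange flowers", "have sex"}

def assents(meaning, word, behavior):
    """An agent assents to a sentence using `word` iff every behavior
    it privately associates with the word is actually observed."""
    return meaning[word] <= behavior  # subset test

# Both agents agree that "I love you" is true of their situation...
a_agrees = assents(meaning_A, "love", observed_behavior)
b_agrees = assents(meaning_B, "love", observed_behavior)
print(a_agrees, b_agrees)                      # True True

# ...even though the internal states attached to the word never match.
print(meaning_A["love"] == meaning_B["love"])  # False
```

Every behavioral probe returns agreement, so by any functional measure
"communication" succeeded -- yet nothing was transmitted from one private
mapping to the other.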

>>It is a signal which triggers
>>associations within ourselves.  When someone says the word "specious" I
>>think of many things, including "religion" and "painted wooden rocking horse
>>from the play 'Inherit the Wind'" but neither of these are probably what is
>>being "transmitted."

>Sounds like we are just arguing over semantics. We must agree on the meaning 
>of these symbols, else you'll never know what I meant to say. 

Why do you think I ever do know what you MEANT to say?

>The act of 
>"reconstruction" is the act of decoding the sequence of symbols. Because we 
>don't have _exact_ sets of associations (as is demonstrated by this very 
>debate), we sometimes decode incorrectly. But after an interchange of such 
>symbols, we eventually get the intended message across.

Why do you believe this?  Your measurement of effectiveness is only through
functional behavior.  You are not measuring the consistency of mental
states.

Greg Stevens

stevens@prodigal.psych.rochester.edu

