Article 6184 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!cs.utexas.edu!uunet!mcsun!uknet!edcastle!edcogsci!dlh
From: dlh@cogsci.ed.ac.uk (Dominik Lukes)
Newsgroups: comp.ai.philosophy
Subject: Re: Transducers
Message-ID: <9721@scott.ed.ac.uk>
Date: 9 Jun 92 18:22:51 GMT
References: <BILL.92Jun7131519@ca3.nsma.arizona.edu> <1992Jun7.210138.21887@cs.ucf.edu> <9706@scott.ed.ac.uk> <1992Jun9.061532.10440@cs.ucf.edu>
Distribution: world,local
Organization: Centre for Cognitive Science, Edinburgh, UK
Lines: 54

In article <1992Jun9.061532.10440@cs.ucf.edu> gomez@barros.cs.ucf.edu (Fernando Gomez) writes:
>In article <9706@scott.ed.ac.uk> dlh@cogsci.ed.ac.uk (Dominik Lukes) writes:
>>In article <1992Jun7.210138.21887@cs.ucf.edu> gomez@barros.cs.ucf.edu (Fernando Gomez) writes:
>>>and others.
>>>But, Brooks' insects will not be able to conduct a conversation about
>>>eating (even if they knew how to eat) because they have no way to verbalize
>>>or conceptualize their stimuli. They would need some kind of a priori
>>>representation (ungrounded concepts) to frame their stimuli.
>>>
>>>Fernando Gomez
>>
>>
>>
>>Nor do real insects or even dogs, but people somehow got fed up and
>>spoke aloud about all the misery, ruling the world, especially when
>>the waiter brings you yesterday's fish as today's beefsteak. But, please,
>>note that computers are unable to *verbalise* anything either; they just
>>operate with strings of characters, which, although not absolutely
>>meaningless, as Searle claims, do not have the same meaning as they do
>>for humans or Brooks' fancy insects.
>
>"Same meaning!" - That is a heavy one, a lot worse than "grounding."
>Certainly, program concepts do not have the same meaning as they do
>for humans. But human concepts do not have the same meaning across
>humans either, because our knowledge and personal cognitive histories are so
>different!
Now you have hit the point: whole nations have gone extinct because of
this, so don't say it in such an ironic tone. BUT people are still able
to find some level at which they achieve their so-desired-for
"understanding", and pass apples to each other.

>Meanings are not apples (the denotations of model semantics)
>that one interlocutor throws to the other to catch. So, in this regard,
>humans are not better off than computers. Meaning is as elusive to
>computers as it is to us.
>
But for computers understanding is just the matching of strings; there is
no redundancy, no "centered categorization", no embodiment; the meaning is
purely syntactic, all is precise, but they are still too slow to handle
all that humans, in possession of all the above-mentioned, "can" do by
their "abstract symbol manipulation", if it is possible at all, which
is still an open issue, so, please, don't yell at me that it is bogus or
something you don't like (do you like spinach?).
It was a long sentence, wasn't it?

Yours sincerely & faithfully,
Dominik.
======================================
    My spELling iS wobbly.              
It's goOd spelling bUt it wobbles       
   and tHe letters get iN               
      the wrOng plaCes.                 
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,