From newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!darwin.sura.net!cs.ucf.edu!barros!gomez Tue Jun  9 10:08:06 EDT 1992
Article 6171 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!darwin.sura.net!cs.ucf.edu!barros!gomez
From: gomez@barros.cs.ucf.edu (Fernando Gomez)
Subject: Re: Transducers
Message-ID: <1992Jun9.061532.10440@cs.ucf.edu>
Sender: news@cs.ucf.edu (News system)
Organization: University of Central Florida, Orlando
References: <BILL.92Jun7131519@ca3.nsma.arizona.edu> <1992Jun7.210138.21887@cs.ucf.edu> <9706@scott.ed.ac.uk>
Distribution: world,local
Date: Tue, 9 Jun 1992 06:15:32 GMT

In article <9706@scott.ed.ac.uk> dlh@cogsci.ed.ac.uk (Dominik Lukes) writes:
>In article <1992Jun7.210138.21887@cs.ucf.edu> gomez@barros.cs.ucf.edu (Fernando Gomez) writes:
>>and others.
>>But, Brooks' insects will not be able to conduct a conversation about
>>eating (even if they knew how to eat) because they have no way to verbalize
>>or conceptualize their stimuli. They would need some kind of a priori
>>representation (ungrounded concepts) to frame their stimuli.
>>
>>Fernando Gomez
>
>Nor do real insects or even dogs, but people somehow got fed up and
>spoke aloud about all the misery ruling the world, especially when the
>waiter brings you yesterday's fish as today's beefsteak. But, please,
>note that computers are unable to *verbalise* anything either; they just
>operate with strings of characters, which, although not absolutely
>meaningless, as Searle claims, do not have the same meaning as they do
>for humans or Brooks' fancy insects.

"Same Meaning!" - That is a heavy one, a lot worse than "grounding."
Certainly,  program  concepts do not have the same meaning as they do
have for humans. But, human concepts do not have the same meaning across
humans either, because our knowledge and personal cognitive histories are so
different! Meanings are not apples (the denotations of model semantics)
that an intelocutor throw to the other one to catch. So, in this regard
humans are not better off than computers. Meaning is as elusive to
computers as to us.
 
Fernando Gomez
-------------
