From newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!swrinde!mips!darwin.sura.net!cs.ucf.edu!barros!gomez Tue Jun  9 10:07:43 EDT 1992
Article 6141 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!swrinde!mips!darwin.sura.net!cs.ucf.edu!barros!gomez
From: gomez@barros.cs.ucf.edu (Fernando Gomez)
Newsgroups: comp.ai.philosophy
Subject: Re: Transducers
Message-ID: <1992Jun7.210138.21887@cs.ucf.edu>
Date: 7 Jun 92 21:01:38 GMT
References: <BILL.92Jun6194350@ca3.nsma.arizona.edu> <1992Jun7.034525.16059@cs.ucf.edu> <BILL.92Jun7131519@ca3.nsma.arizona.edu>
Sender: news@cs.ucf.edu (News system)
Distribution: world,local
Organization: University of Central Florida, Orlando
Lines: 41

In article <BILL.92Jun7131519@ca3.nsma.arizona.edu> bill@nsma.arizona.edu (Bill Skaggs) writes:
>In article <1992Jun7.034525.16059@cs.ucf.edu> 
>gomez@barros.cs.ucf.edu (Fernando Gomez) writes:
>
>   Bill Skaggs writes:
>
>   "... If the machine is designed to be a conversation partner, then TT
>   capability is sufficient; if it is designed as, say, a gymnastics
>   instructor, some degree of TTT capability is necessary ..." (End of Quote)
>
>
>   It may not be sufficient. It depends on the subject matter of the
>   conversation. Suppose that one builds a robot that eats just like we
>   do. Won't that robot be in a better position to conduct a conversation
>   with us about eating than a program/robot whose entire knowledge of
>   eating comes from being told?  [ . . . ]
>
>Well, passing the Turing Test requires conversation indistinguishable
>from that of a human, so this is a moot question.  But from a
>practical point of view, it may very well be impossible to build a
>system that can imitate human language without having humanlike
>sensorimotor capabilities.
>
>Rodney Brooks, who builds fantastic insect-like robots at MIT, claims
>that one of the greatest hindrances to AI has been the lack of
>"embodiment" of the systems that people have built; he argues that
>this leads to unrealistic conceptions of the nature of the problems
>that need to be solved.  (There is a nice tech report, in PostScript
>format, available for anonymous ftp from ftp.ai.mit.edu; the file is
>/pub/doc/brooks-ijcai91.ps.Z.) 
>
>	-- Bill

I am familiar with the work of Brooks, Rosenschein, Agre,
and others.
But Brooks's insects would not be able to conduct a conversation about
eating (even if they knew how to eat), because they have no way to verbalize
or conceptualize their stimuli. They would need some kind of a priori
representation (ungrounded concepts) to frame their stimuli.

Fernando Gomez


