From newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!darwin.sura.net!cs.ucf.edu!barros!gomez Tue Jun  9 10:08:05 EDT 1992
Article 6170 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!darwin.sura.net!cs.ucf.edu!barros!gomez
From: gomez@barros.cs.ucf.edu (Fernando Gomez)
Subject: Re: Transducers
Message-ID: <1992Jun9.051649.9894@cs.ucf.edu>
Sender: news@cs.ucf.edu (News system)
Organization: University of Central Florida, Orlando
References: <60790@aurs01.UUCP> <BILL.92Jun8150837@cortex.nsma.arizona.edu> <1992Jun8.221324.535@mp.cs.niu.edu>
Date: Tue, 9 Jun 1992 05:16:49 GMT

In article <1992Jun8.221324.535@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil Rickert) writes:
>In article <BILL.92Jun8150837@cortex.nsma.arizona.edu> bill@nsma.arizona.edu (Bill Skaggs) writes:
>[Discussion of the need of "embodiment" of a robot]
>>throop@aurs01.UUCP (Wayne Throop) writes:
>
>>   But surely this is talking about *engineering* or design problems,
>>   not any problems of TT-passing-without-embodiedness in *principle*.
>
>>In any case, I can't get too excited about "in principle" arguments.
>>As far as I can see, saying something is true in principle means
>>saying it would be true if you could change the inconvenient part of
>>reality.
>
> Perhaps.  But perhaps an argument "in principle" means that once a
>successful embodied robot has been created, a disembodied version can
>then be created *in practice* merely by copying the disk containing all
>of the learned knowledge about the real world.
>

Not so. The embodied robot relies upon its embodied functions to do
problem solving, and this aspect is not part of the learned knowledge.
To be specific, consider my example about conducting a conversation
with an embodied eating-robot and a disembodied robot, and the question:
Do the teeth get busier in eating an apple than in eating jello?
Even if the eating-robot knows nothing about jello, it can still
find out the answer to this question by eating some jello. This knowledge
is learned and can be passed to the disembodied robot. But what if
we replace "jello" with "ice cream" in the question -- would the disembodied
robot know what to answer? The answer is "yes" only if the eating-robot
has already eaten ice cream and learned this knowledge. But the eating-robot
cannot be exposed to a potentially infinite number of substances, hence
there will always be something that the disembodied robot will fail to
answer/solve.
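The asymmetry can be put as a toy model (all names here are hypothetical,
a sketch of the argument rather than anything like an actual robot): the
embodied robot can run a new "experiment" on an unseen substance, while the
disembodied copy can only consult whatever knowledge was on the copied disk.

```python
# Toy model of the embodied/disembodied asymmetry. The embodied robot
# can learn about an unseen substance by "eating" it; the disembodied
# robot is limited to a snapshot of previously learned knowledge.

class EmbodiedRobot:
    def __init__(self):
        self.learned = {}  # substance -> chewing effort, learned by eating

    def eat(self, substance, effort):
        # Stand-in for actually eating and measuring; in a real robot
        # the effort value would come from its own sensors.
        self.learned[substance] = effort

    def chewing_effort(self, substance, measured_effort=None):
        # Embodied fallback: if the substance is unknown, eat it now.
        if substance not in self.learned and measured_effort is not None:
            self.eat(substance, measured_effort)
        return self.learned.get(substance)


class DisembodiedRobot:
    def __init__(self, copied_knowledge):
        self.knowledge = dict(copied_knowledge)  # the "copied disk"

    def chewing_effort(self, substance):
        # No body, so no way to run a new experiment: lookup only.
        return self.knowledge.get(substance)


robot = EmbodiedRobot()
robot.eat("apple", 9)
robot.eat("jello", 1)

clone = DisembodiedRobot(robot.learned)  # copy the disk
print(clone.chewing_effort("apple"))      # answerable: it was learned
print(clone.chewing_effort("ice cream"))  # None: never eaten, never learned
```

The point of the sketch is the last line: for any finite `learned` table
there is some substance outside it, which the embodied robot can handle by
falling back on its body and the disembodied copy cannot.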

Fernando Gomez


