From newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!usc!sdd.hp.com!uakari.primate.wisc.edu!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!mp.cs.niu.edu!rickert Mon Jun 15 16:04:22 EDT 1992
Article 6176 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rpi!usc!sdd.hp.com!uakari.primate.wisc.edu!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!mp.cs.niu.edu!rickert
From: rickert@mp.cs.niu.edu (Neil Rickert)
Newsgroups: comp.ai.philosophy
Subject: Re: Transducers
Message-ID: <1992Jun9.153213.2220@mp.cs.niu.edu>
Date: 9 Jun 92 15:32:13 GMT
References: <BILL.92Jun8150837@cortex.nsma.arizona.edu> <1992Jun8.221324.535@mp.cs.niu.edu> <BILL.92Jun8213911@ca3.nsma.arizona.edu>
Organization: Northern Illinois University
Lines: 20

In article <BILL.92Jun8213911@ca3.nsma.arizona.edu> bill@nsma.arizona.edu (Bill Skaggs) writes:
>rickert@mp.cs.niu.edu (Neil Rickert) writes:
>
>   Perhaps.  But perhaps an argument "in principle" means that once a
>   successful embodied robot has been created, a disembodied version can
>   then be created *in practice* merely by copying the disk containing all
>   of the learned knowledge about the real world.
>
>Good point; I accept it.  But note that the disembodied version, when
>participating in a Turing Test, would suffer all the difficulties of a
>disembodied human.

  Agreed.  Whether this would be useful would depend on the intended
purpose of the system.  If the system were intended mostly for abstract
thought about mathematics (or about angels dancing on the head of a pin
for that matter), being disembodied might not be a serious handicap.  But
for other issues, it might be a serious deficiency.

 [Footnote:  What is all this agreement?  Are we breaking the rules of this
newsgroup by actually agreeing on something?]
