From newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rutgers!gatech!ncar!noao!amethyst!organpipe.uug.arizona.edu!organpipe.uug.arizona.edu!bill Tue Jun  9 10:08:04 EDT 1992
Article 6169 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!torn.onet.on.ca!utgpu!news-server.csri.toronto.edu!rutgers!gatech!ncar!noao!amethyst!organpipe.uug.arizona.edu!organpipe.uug.arizona.edu!bill
From: bill@nsma.arizona.edu (Bill Skaggs)
Newsgroups: comp.ai.philosophy
Subject: Re: Transducers
Message-ID: <BILL.92Jun8213911@ca3.nsma.arizona.edu>
Date: 9 Jun 92 04:39:11 GMT
References: <BILL.92Jun7131519@ca3.nsma.arizona.edu> <60790@aurs01.UUCP>
	<BILL.92Jun8150837@cortex.nsma.arizona.edu>
	<1992Jun8.221324.535@mp.cs.niu.edu>
Sender: news@organpipe.uug.arizona.edu
Organization: ARL Division of Neural Systems, Memory and Aging, University of
	Arizona
Lines: 22
In-Reply-To: rickert@mp.cs.niu.edu's message of 8 Jun 92 22:13:24 GMT

rickert@mp.cs.niu.edu (Neil Rickert) writes:

   Perhaps.  But perhaps an argument "in principle" means that once a
   successful embodied robot has been created, a disembodied version can
   then be created *in practice* merely by copying the disk containing all
   of the learned knowledge about the real world.

Good point; I accept it.  But note that the disembodied version, when
participating in a Turing Test, would suffer all the difficulties of a
disembodied human.  A question such as, "Scrunch up the palm of your
hand, and describe the folds that you see," would cause it great
difficulty, unless it were connected to an impossibly detailed
simulation of the real world.  Should we avoid this problem by placing
the human participant in a sensory isolation tank?  But even this
might not be enough . . .

I think that if this line of reasoning is followed out, the inevitable
result is that a machine Turing-equivalent to a human must either be
far more intelligent than any human or else must come pretty close to
Total-Turing equivalence.

	-- Bill
