Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!EU.net!uknet!festival!edcogsci!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Subject: Re: Turing's Playful Games
Message-ID: <D5qyDH.MwH@cogsci.ed.ac.uk>
Sender: usenet@cogsci.ed.ac.uk (C News Software)
Nntp-Posting-Host: bute.aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
References: <3kcqcr$3td@mp.cs.niu.edu> <3kdeib$f03@ixnews2.ix.netcom.com> <3kepmp$nfd@mp.cs.niu.edu>
Date: Mon, 20 Mar 1995 16:12:53 GMT
Lines: 25

In article <3kepmp$nfd@mp.cs.niu.edu> rickert@cs.niu.edu (Neil Rickert) writes:
>In <3kdeib$f03@ixnews2.ix.netcom.com> Aftrglow@ix.netcom.com (Tom Hunscher) writes:
>
>>In <3kcqcr$3td@mp.cs.niu.edu> rickert@cs.niu.edu (Neil Rickert) writes: 
>
>>>>That's my point. I DO know what I am. A Turing machine doesn't have
>>>>to, does it. In fact, it doesn't have to know anything except how to
>>>>respond like a person (which means, unlike what it really is).
>
>>>If a Turing machine doesn't know, then it flunks the Turing Test.  So
>>>your argument does not demonstrate that the test is bogus.
>
>>I'm not following. The Turing Test is a test of the ability of a machine 
>>to dissemble that it is human, is it not? 
>
>The robot is confused, and thinks it is human.  The discussion turns
>to the texture of human skin.  Thinking it is human, the robot
>describes its own metallic skin.  The robot promptly flunks the TT.

That's not the only thing the robot could do.  For instance, it
might describe human skin.  Perhaps it was designed to pass the TT,
and was modified until it answered questions of this sort "correctly".

