Newsgroups: sci.skeptic,alt.consciousness,comp.ai.philosophy,sci.philosophy.meta,rec.arts.books
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!miner.usbm.gov!rsg1.er.usgs.gov!jobone!newsxfer.itd.umich.edu!zip.eecs.umich.edu!newshost.marcam.com!news.mathworks.com!news.alpha.net!uwm.edu!math.ohio-state.edu!howland.reston.ans.net!pipex!sunsite.doc.ic.ac.uk!uknet!festival!edcogsci!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Subject: Re: Penrose and Searle (was Re: Roger Penrose's fixed ideas)
Message-ID: <CzFsqK.A1s@cogsci.ed.ac.uk>
Sender: usenet@cogsci.ed.ac.uk (C News Software)
Nntp-Posting-Host: bute-alter.aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
References: <388o02$gla@ixnews1.ix.netcom.com> <39oqc8$9gb@news-rocq.inria.fr> <1994Nov9.160430@west.cscwc.pima.edu>
Distribution: inet
Date: Fri, 18 Nov 1994 00:15:55 GMT
Lines: 19
Xref: glinda.oz.cs.cmu.edu sci.skeptic:95538 comp.ai.philosophy:22192 sci.philosophy.meta:14828

In article <1994Nov9.160430@west.cscwc.pima.edu> 103t_english@west.cscwc.pima.edu writes:

>> By the way, does anybody think that the Chinese room example could be improved,
>> not to prove that a machine cannot be intelligent of course, but to clearly
>> point out limitations of a purely behaviorist definition of AI like
>> Turing's.
>> 
>> Mikal
>
>
>How could it be? One could then redefine Turing's test to include "behavioral"
>stuff that required something more than a computer terminal to demonstrate.

Sure, but then it would be a different test.  A number of different
tests are clearly possible (consider, e.g., the various ones in _Do
Androids Dream of Electric Sheep_).  But that shouldn't stop us
from drawing conclusions about one particular test, namely the
Turing Test.