From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon Dec 16 11:01:35 EST 1991
Article 2088 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: The Calculus Room
Message-ID: <1991Dec13.043511.19575@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1991Dec5.191043.10565@psych.toronto.edu> <44801@mimsy.umd.edu> <1991Dec12.195125.14719@mp.cs.niu.edu>
Date: Fri, 13 Dec 1991 04:35:11 GMT

In article <1991Dec12.195125.14719@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil Rickert) writes:

["Calculus Room" analogy deleted]

> Surely the Chinese room suffers the same failing.  The computer is said to
>pass the Turing Test because it can do as well as humans who clearly do not
>understand what they are doing in their inferior test which only requires
>table lookups.  It is then concluded that the computer obviously must not
>understand either.

"Table lookups" are not relevant to the argument.  No specific architecture
or programming technique is at issue.  Write *any* program you want --
Searle would still claim it has no understanding.

- michael
