From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon Dec 16 11:01:35 EST 1991
Article 2089 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: The Martian Room
Message-ID: <1991Dec13.043758.19880@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1991Dec12.195125.14719@mp.cs.niu.edu> <1991Dec12.195341.16163@mp.cs.niu.edu>
Date: Fri, 13 Dec 1991 04:37:58 GMT

In article <1991Dec12.195341.16163@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil Rickert) writes:
>
> Here we conduct a two part experiment in the Martian Room.
>
>	Just as the occupants of the Chinese Room did not understand a word of 
>Chinese, so also the occupants of the Martian Room do not understand a word of 
>Martian.  It so happens, though, that they are very fluent in Chinese.
>
> PART I.
>
>	We now give the Martian Room occupants the Chinese rule book, and provide 
>them with the same input as in the original Chinese Room experiment.  Nobody 
>can doubt that they really do understand what they are doing.
>
> But, as they perform their tasks, a strange thing happens.  We find that they 
>are often paraphrasing the answers.  They are following the spirit of the rule 
>book, but not the letter of the rule book.
>
> We now compare with the computer.  Indeed the computer follows the letter of 
>the rule book.  Clearly the computer flunks the Turing test.
>
> PART II.
>
> We take that broken computer which only uses a rule-based expert system
>approach, and dump it in the trash can in Searle's office.  We replace it
>with the newest model which is fluent in Chinese.  Lo and behold, we
>discover it often paraphrases the answers, and follows the spirit of the
>rule book but not the letter.
>
> Please explain how this test proves the computer is not intelligent!

Neil, I simply do not understand your point, or for that matter your example.
What does it mean to get a computer "fluent in Chinese"?  How could such
a program not follow "the letter of the rule book", i.e., its  
program?!  There is some *very* weird shit going on here...
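To make my puzzlement concrete, here is a toy sketch (entirely my own invention -- the inputs, replies, and names are placeholders, not anything from Neil's post): even a room that "paraphrases" is following the letter of *some* rule book; the variation is just written into the rules one level down.

```python
import random

# Literal rule book: exactly one canned reply per input symbol
# ("the letter of the rule book").
RULE_BOOK = {"ni hao": "ni hao ma?"}

def literal_room(symbol):
    """Follows the rule book to the letter."""
    return RULE_BOOK[symbol]

# "Paraphrasing" rule book: several interchangeable replies per input.
# The phrases here are arbitrary placeholders, chosen only for illustration.
PARAPHRASE_BOOK = {"ni hao": ["ni hao ma?", "hao jiu bu jian!"]}

def paraphrasing_room(symbol, rng=random):
    """Looks like it follows only the 'spirit' of the rules -- but the
    choice among variants is itself dictated, to the letter, by this
    very program."""
    return rng.choice(PARAPHRASE_BOOK[symbol])
```

Both rooms do nothing but execute their program; the second merely has a rule book with a coin-flip written into it.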

- michael



