From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!uchinews!spssig!markrose Thu Jan  9 10:34:21 EST 1992
Article 2580 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!uchinews!spssig!markrose
From: markrose@spss.com (Mark Rosenfelder)
Newsgroups: comp.ai.philosophy
Subject: Re: The Robot Reply (was Re: Searle, again)
Message-ID: <1992Jan08.231644.39424@spss.com>
Date: 8 Jan 92 23:16:44 GMT
References: <5825@skye.ed.ac.uk> <309@tdatirv.UUCP> <5908@skye.ed.ac.uk>
Organization: SPSS, Inc.
Lines: 23
Nntp-Posting-Host: spssrs7.spss.com

In article <5908@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
>There is no equivalent supposition that humans have no understanding
>without sensors.  

Wanna bet?  You are claiming that a human could develop understanding
without ever having had sensory input?  Understanding of what, pray tell?
Would such a human pass the Turing test?

>Of course, sensors help in learning.  But if a
>person was in a Turing Test, the person can ignore everything except
>what's coming in on the teletype and still understand what's being
>said.  

Yes, by virtue of the incredible mass of information about the world
acquired by living in it.

>Of course, a computer might learn, with the aid of sensors, and
>then be put in the TT.  But the same kind of information could
>have been in the program from the start.  

In which case, there are grounds to argue that the program does have
real semantics, since its symbols can be tied to that mass of information
about the world.  (To put it another way: Searle's assumptions are dubious.)
