From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon May 25 14:06:36 EDT 1992
Article 5784 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Grounding: Real vs. Virtual (formerly "on meaning")
Organization: Department of Psychology, University of Toronto
References: <1992May19.003821.9450@Princeton.EDU> <1992May19.220141.29649@psych.toronto.edu> <1992May20.043004.2732@news.acns.nwu.edu>
Message-ID: <1992May20.194807.9665@psych.toronto.edu>
Keywords: symbol, analog, Turing Test, robotics
Date: Wed, 20 May 1992 19:48:07 GMT

In article <1992May20.043004.2732@news.acns.nwu.edu> learn@speedy.acns.nwu.edu (William J. Vajk) writes:
>In article <1992May19.220141.29649@psych.toronto.edu> Michael Gemar writes:

>>You [Harnad] seem to place great weight on the analog nature of the physical
>>world; indeed, it seems as though it is this aspect upon which
>>you rest symbol grounding.
>
>>Atoms are certainly discrete particles, and I believe that some 
>>physicists have theorized that space and time may be discrete.
>
>This has become a matter of routine demonstration for almost every 
>experiment conducted in recent years. The general perception of
>an analog nature remains unchanged.

Indeed, although, as I pointed out, that perception persists *only*
because of the relative insensitivity of our perceptual apparatus, and so,
as far as I can see, it has no philosophical import in this argument.

>>5. Statements 3. and 4. taken together seem to indicate that there
>>   is a principled difference in the way that humans and programs could
>>   obtain semantics.  Specifically, it points to a situation in which
>>   what is sufficient for humans is not sufficient for a program.  The
>>   grounding of symbols in the physical world, while not sufficient
>>   for a program, is sufficient for a human.  Therefore, symbol
>>   grounding in the physical world cannot be the cause of the     
>>   meaningfulness of symbols. 
>
>You seem to have developed a closed system here in which the key of 
>equivalence isn't. The definition of equivalence, after all, is 
>similarity of response to a given stimulus. It seems to me as though
>you've just said, "1 != 1". I don't believe the problem is one in
>which grounding isn't the key so much as the failure to accurately
>replicate.

I'm not sure what you mean here, Bill.  I am happy to argue that both the
program and the human might *act* equivalently in the virtual system -
the discussion is, in part, about the validity of behavioural methods for
determining mentality.  But if one accepts, as Harnad does, that pure
symbol manipulation can't produce semantics, then we seem to be driven
to the point where, even though a program and a human might *act* the same,
they are *not* equivalent when it comes to semantics - the latter has it,
the former doesn't.  This *is* the grounding problem.

- michael