From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon May 25 14:06:54 EDT 1992
Article 5816 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Grounding: Real vs. Virtual (formerly "on meaning")
Organization: Department of Psychology, University of Toronto
References: <1992May20.030811.13711@mp.cs.niu.edu> <1992May20.150243.25894@psych.toronto.edu> <1992May20.191738.18644@mp.cs.niu.edu>
Message-ID: <1992May21.173906.22368@psych.toronto.edu>
Keywords: symbol, analog, Turing Test, robotics
Date: Thu, 21 May 1992 17:39:06 GMT

In article <1992May20.191738.18644@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil Rickert) writes:

>  When you say 'In principle, you could have a "brain-in-a-vat"' you are
>sweeping a great deal under the rug.  Perhaps you are also sweeping all
>of Harnad's ideas under the rug too.

This seems to be Harnad's view as well, although I am not convinced.

>  Even if you ignore the practical problems of getting the brain into the
>vat, it may not work.  I suspect we make a serious mistake by trying to
>locate the mind within the brain.  There are probably information paths which
>are not fully within the brain, yet are necessary for its function.
>Thus a brain signal may stimulate an organ to release a hormone, which may
>affect another organ, which may affect blood chemistry and the chemical
>environment of the brain.  Information paths of this form may well be
>essential.

OK, so put artificial hormone releasers in the vat as well.  Heck, I
don't care *what* goes in the vat, as long as it isn't transducers.  I
don't think your objections are fatal to the position.

>>That is, it would never see a cat, but only the image of a cat. Thus, its
>>tokening of "cat" would not refer to cats.  It would never feel a scratch 
>>on its arm, but only the "image" of a scratch on its arm. 
>
>  It can be argued that you never see a cat now, either, but only the image
>of a cat.  In other words, what you perceive of vision is perhaps already
>better thought of as a virtual reality, created by the brain as a way of
>integrating input from the two eyes, perhaps from other sensory organs, and
>information from memory.

Indeed one could, which makes the importance of "real"-world grounding
dubious at best.  (To be fair, Harnad clearly does *not* require real-world
grounding, only real transducers.)

- michael 
