From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!olivea!netsys!pagesat!spssig.spss.com!markrose Sat Oct 24 20:44:58 EDT 1992
Article 7383 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!olivea!netsys!pagesat!spssig.spss.com!markrose
From: markrose@spss.com (Mark Rosenfelder)
Newsgroups: comp.ai.philosophy
Subject: Re: grounding and the entity/environment boundary
Message-ID: <1992Oct23.161211.5628@spss.com>
Date: 23 Oct 92 16:12:11 GMT
References: <718611244@sheol.UUCP> <1992Oct12.203359.8713@spss.com> <719720414@sheol.UUCP>
Sender: news@spss.com (Net News Admin)
Organization: SPSS Inc.
Lines: 36

In article <719720414@sheol.UUCP> throopw@sheol.UUCP (Wayne Throop) writes:

...oh, a lot of things, not much of which I really disagree with, except
for the last paragraph.

>If it'd remain grounded with the camera removed (and presumably you'd
>agree that a human would remain grounded after having the eyes
>surgically removed (eg, due to retinal cancer or other adequate
>reason)), and if grounding isn't something that needs to be "kept up",
>then why did the camera ever need to be attached to that computer in
>the first place?  Certainly, a human can only "get grounded" in color
>as above by once having eyes, but the supposition that that's the
>only way seems wrong to me.

Actually I'm coming to believe that grounding does have to be kept up,
though on a scale of years rather than hours.  Chris Malcolm's post on 
this about 10 days ago was very good.

With AI we certainly get into puzzles we never faced when dealing only with
humans, such as whether you can borrow your grounding from someone else.
Still, we've already said that groundedness might be a graded category,
so I think we can characterize some of the scenarios you describe, such as
experience copied from another system or processes that time-share their
senses, as defective in their grounding.  They have a limited sensorimotor
capacity, or they have no means to maintain their real-world experience,
and their groundedness is correspondingly diminished.

>So, what does "direct experience of the world" really mean?  I propose
>that ultimately it is an empty phrase, signifying nothing.

Here is where I still disagree.  Fuzzy boundaries don't make a category
disappear.  Fuzzy categories aren't really that hard to deal with: you base
your thinking on the clear central members and modify it to account for
the fringes.  In the case at hand, normal humans and experienced
intelligent robots are nicely grounded; stranger cases can be analyzed,
as appropriate, as fully grounded, partly grounded, or not grounded at all.
