From newshub.ccs.yorku.ca!torn!cs.utexas.edu!uunet!secapl!Cookie!frank Fri Oct 30 15:17:48 EST 1992
Article 7409 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!uunet!secapl!Cookie!frank
From: frank@Cookie.secapl.com (Frank Adams)
Subject: Re: grounding and the entity/environment boundary
Message-ID: <1992Oct27.205040.117959@Cookie.secapl.com>
Date: Tue, 27 Oct 1992 20:50:40 GMT
References: <1992Oct12.203359.8713@spss.com> <719720414@sheol.UUCP> <1992Oct23.161211.5628@spss.com>
Organization: Security APL, Inc.
Lines: 35

In article <1992Oct23.161211.5628@spss.com> markrose@spss.com (Mark Rosenfelder) writes:
>In article <719720414@sheol.UUCP> throopw@sheol.UUCP (Wayne Throop) writes:
>...oh, a lot of things, not much of which I really disagree with, except
>for the last paragraph.   
>
>>If it'd remain grounded with the camera removed (and presumably you'd
>>agree that a human would remain grounded after having the eyes
>>surgically removed (eg, due to retinal cancer or other adequate
>>reason)), and if grounding isn't something that needs to be "kept up",
>>then why did the camera ever need to be attached to that computer in
>>the first place?  Certainly, a human can only "get grounded" in color
>>as above by once having eyes, but the supposition that that's the
>>only way seems wrong to me.
>
>Actually I'm coming to believe that grounding does have to be kept up,
>though on a scale of years rather than hours.  Chris Malcolm's post on 
>this about 10 days ago was very good.

This ignores the difference between human memory, which deteriorates over
time, and computer memory, which need not (at least not on comparable time
scales).

>With AI we certainly get into puzzles we never faced when dealing only with
>humans, such as whether you can borrow your grounding from someone else.
>Still, we've already said that groundedness might be a graded category,
>so I think we can describe some of the scenarios you describe, such as
>experience copied from another system or processes that time-share their
>senses, as defective in their grounding.  They have a limited sensorimotor
>capacity, or they have no means to maintain their real-world experience,
>and their groundedness is correspondingly diminished.

But if the system has a good enough memory that it doesn't need to maintain
its grounding, in what way is the grounding diminished?

