From newshub.ccs.yorku.ca!torn!cs.utexas.edu!uwm.edu!rpi!psinntp!psinntp!dg-rtp!sheol!throopw Fri Oct 30 15:18:08 EST 1992
Article 7434 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!uwm.edu!rpi!psinntp!psinntp!dg-rtp!sheol!throopw
From: throopw@sheol.UUCP (Wayne Throop)
Newsgroups: comp.ai.philosophy
Subject: Re: grounding and the entity/environment boundary
Summary: clarification requested
Message-ID: <720241604@sheol.UUCP>
Date: 28 Oct 92 00:09:09 GMT
References: <718611244@sheol.UUCP> <1992Oct12.203359.8713@spss.com> <719720414@sheol.UUCP> <1992Oct23.161211.5628@spss.com>
Lines: 84

: From: markrose@spss.com (Mark Rosenfelder)
: Message-ID: <1992Oct23.161211.5628@spss.com>
: Actually I'm coming to believe that grounding does have to be kept up,
: though on a scale of years rather than hours.  Chris Malcolm's post on
: this about 10 days ago was very good.

Then could you revisit why someone who moves away from a
town (or has their eyes amputated, etc.) remains grounded?  As
near as I can tell, the previous response depended on the
"grounding is instantaneous" position.  (See also below.)

: Still, we've already said that groundedness might be a graded category,

Right.

: so I think we can describe some of the scenarios you describe, such as
: experience copied from another system or processes that time-share their
: senses, as defective in their grounding.

I don't follow this.  Why are these forms of grounding "defective"?
They (can) lead to identical capabilities, so I fail to see the defect.

: They have a limited sensorimotor capacity, 

Huh?  Just because the grounding is "borrowed" says nothing about
the sensorimotor capacity.  And timeshared senses/effectors are
certainly moot (at the very least in moderation), since humans
(arguably) don't use their senses and effectors full-time, and if
some other entity timeshared them, they'd pretty clearly be no
worse off, groundingwise.

: or they have no means to maintain their real-world experience,
: and their groundedness is correspondingly diminished.

Huh?  Then what about the moved-away-from-town case (as above),
and why wouldn't lower bandwidth sensor/effector access suffice
to maintain grounding even if it didn't suffice to originate it?

I'm very puzzled.

:: So, what does "direct experience of the world" really mean?  I propose
:: that ultimately it is an empty phrase, signifying nothing.
: Here is where I still disagree.  Fuzzy boundaries don't make a category
: disappear.  Fuzzy categories aren't really that hard to deal with; you base
: your thinking on the clear central members and modify your thinking to
: account for the fringes.

OK, so I overstated a tad.   (understatement icon goes here...)

: In the case at hand, normal humans and experienced
: intelligent robots are nicely grounded; stranger cases can be analyzed as
: appropriate as fully grounded, partly grounded, or not grounded at all.

Hmmmmm... I think this still doesn't address my objection.  The
stranger cases are not definite points along a spectrum, they are
*ambiguous*.  Is a compute engine connected via an ethernet to a camera
and some effectors a situation of a computer with defective grounding,
or a case of a robot with perfectly good grounding?

If one decides to consider the computer case, the compute engine's
symbols are ungrounded, and it is sending squiggles and squoggles
along the ethernet.  If one decides to consider the robot case, the
very same symbols in the very same compute engine ARE grounded, and
sensible meaningful information is symbolized by the packets flying
on the ethernet.

So, maybe the conclusion is that this "grounding" isn't something that a
symbol system (or, to be more finicky, a physical realization of a
symbol system) either "has" or "doesn't have" (nor even "maintains" or
"doesn't maintain"), but is another of those things that is purely in
the minds of beholders?

I can think of ways to make things more objective, like "a symbol
system is potentially grounded if there exists a physical system
including the symbol system that can be considered to have
wide-bandwidth senses which 'directly experience' the world", or some
such.  But doesn't this case include the
computer-borrows-grounding-via-download scenario?  (By just including
the cameras and such at 'grounding central', along with the update
diskettes mailed out to the local machine, and so on and on, all
'part of the system', for example.)
It seems so to me, so I'm not sure how to draw the line to make
"robots" be "grounded" and "computers" not, in any meaningful way.
--
Wayne Throop  ...!mcnc!dg-rtp!sheol!throopw
