From newshub.ccs.yorku.ca!torn!utcsri!rutgers!gatech!usenet.ins.cwru.edu!agate!netsys!pagesat!spssig.spss.com!markrose Fri Oct 30 15:18:14 EST 1992
Article 7440 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!utcsri!rutgers!gatech!usenet.ins.cwru.edu!agate!netsys!pagesat!spssig.spss.com!markrose
From: markrose@spss.com (Mark Rosenfelder)
Newsgroups: comp.ai.philosophy
Subject: Re: grounding and the entity/environment boundary
Message-ID: <markrose.720385670@spssig>
Date: 29 Oct 92 19:07:50 GMT
References: <719720414@sheol.UUCP> <1992Oct23.161211.5628@spss.com> <720241604@sheol.UUCP>
Sender: news@spss.com (Net News Admin)
Organization: SPSS Inc.
Lines: 102

In article <720241604@sheol.UUCP> throopw@sheol.UUCP (Wayne Throop) writes:
>: From: markrose@spss.com (Mark Rosenfelder)
>: Actually I'm coming to believe that grounding does have to be kept up,
>: though on a scale of years rather than hours.  Chris Malcolm's post on
>: this [13 Oct] was very good.
>
>Then could you revisit why someone who moves away from a 
>town (or has eyes amputated, etc, etc) remains grounded?  

Because the deterioration of grounding takes place over years or decades.
If you move to another city, statements you make about your home town are
still grounded in a) your memories of that town, as well as b) your continuing
experiences (i.e. you still deal with human beings, houses, trees, animals).

I can easily believe that grounding would deteriorate faster if b) were not
present-- e.g. the chap confined for years to a sensory deprivation tank.

>: so I think we can describe some of the scenarios you describe, such as
>: experience copied from another system or processes that time-share their
>: senses, as defective in their grounding.
>
>I don't follow this.  Why are these forms of grounding "defective"?
>They (can) lead to identical capabilities, so I fail to see the defect.

I think you do see the defect; that's why you threw in that "(can)"...

Suppose your robot is trying to get itself grounded, but has only time-shared
senses.  It's going to take much longer to accumulate the necessary 
experience with the world, in inverse proportion to its share of the pooled
senses.  In a given time period, it develops *less* grounding than a robot 
with dedicated senses.  Ergo its capabilities are *not* identical, and 
its grounding is to that extent defective.  QED.
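To make the proportionality claim concrete, here is a toy model (my own
sketch, not anything from the thread): assume grounding accumulates
linearly with sensory exposure, so a robot that gets only a fraction of
the pooled senses needs inversely proportionally more time to reach the
same level of experience.

```python
# Toy model (an assumed simplification): grounding accumulates linearly
# with sensory exposure, so a robot owning only a fraction `share` of
# the pooled senses needs 1/share as much wall-clock time to gather
# the same amount of real-world experience.

def time_to_ground(required_experience, share):
    """Hours needed to accumulate `required_experience` units of
    sensory experience, given this robot's share of the pooled senses."""
    if not 0.0 < share <= 1.0:
        raise ValueError("share must be in (0, 1]")
    return required_experience / share

dedicated   = time_to_ground(1000.0, 1.0)   # robot with its own senses
time_shared = time_to_ground(1000.0, 0.25)  # one of four time-sharers
# The time-shared robot needs four times as long -- so at any given
# moment it is *less* grounded than its dedicated counterpart.
```

Under this (admittedly crude) linear assumption, the time-shared robot's
deficit never closes unless it eventually gets dedicated senses.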

I see no problem with one robot acquiring its grounding from another.
I am not sure I see what it means, however, for a computer to acquire
grounding from a robot.  A huge proportion of the human brain is devoted
to sense interpretation and motor control, and surely the remainder is
so to speak built on top of them.  Why should we expect the computer, which 
interacts with the world not at all or only by teletype, to be able to use
a mass of knowledge which is designed for robotic real-world experience?
If the computer's algorithm is not designed around real-world interaction,
I don't see that it can be clearly described as grounded; if it is, it is
like the man confined to a sensory deprivation tank: in danger, I would
think, of going insane.

>: or they have no means to maintain their real-world experience,
>: and their groundedness is correspondingly diminished.
>
>Huh?  Then what about the moved-away-from-town case (as above),
>and why wouldn't lower bandwidth sensor/effector access suffice
>to maintain grounding even if it didn't suffice to originate it?

OK, let's say that X moved out of town when it was mostly agricultural,
had a downtown full of Mom-n-Pop stores, and a social life revolving around 
the VFW.  Since then the farms have been bought up by big companies, malls 
have opened out of town, and most people now work in the factories, the malls, 
or the nearby city.  The old downtown has been outfitted with red brick and lampposts 
to attract yuppies.  X has heard about all this from letters, but hasn't 
been back or even seen photos of the old town.

Now, it seems absurd to me to maintain that X knows the town as well as
he used to-- knows what things look like, knows what it's like to live there.
So no, low-bandwidth sources aren't enough to keep your grounding up to date.

He still has his memories, though, and as I said above these diminish
only over years or decades.  When he talks about what it *used* to be like
to live in his home town, his statements are quite meaningful.

There was an interesting article by Oliver Sacks in the New Yorker recently,
about a situation much like this.  There's a painter who has produced dozens
of fantastically detailed and accurate paintings of the Italian town
where he grew up, and which he hadn't seen for decades.  Seeing his pictures
is a lesson in the sheer quantity of information humans store in their brains.

>I can think of ways to make things more objective, like "a symbol
>system is potentially grounded if there exists a physical system
>including the symbol system that can be considered to have
>wide-bandwidth senses which 'directly experience' the world", or some
>such.  

Actually that's not bad.  The "potentially" and "can be considered" 
are a bit waffly though.  

>But doesn't this case include the computer- borrows- grounding-
>via- download scenario?  (By just including the cameras and such at
>'grounding central' along with the update diskettes mailed out to the
>local machine, and so on and on all 'part of the system', for example.)
>It seems so to me, so I'm not sure how to draw the line to make
>"robots" be "grounded" and "computers" not, in any meaningful way.

You mean, are our statements about computer vs. robot cognition grounded?  
Arguably not, since we don't have any experience with real AIs.
Both of us may be very embarrassed to re-read our remarks in 30 years.

- 

All I am insisting on, really, is that human intelligence developed for
specific purposes in the real world, and remains rooted in that world.  If 
this weren't true of an AI, its intelligence, if any, would be very unhumanlike.
This is, however, heresy for the hoary but influential philosophical position
that human reason is transcendent, abstract, and uncontaminated by nature or 
culture.  


