From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!haven.umd.edu!darwin.sura.net!convex!news.oc.com!spssig.spss.com!markrose Wed Oct 14 14:58:57 EDT 1992
Article 7240 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!haven.umd.edu!darwin.sura.net!convex!news.oc.com!spssig.spss.com!markrose
From: markrose@spss.com (Mark Rosenfelder)
Subject: Re: Grounding
Message-ID: <1992Oct12.203359.8713@spss.com>
Sender: news@spss.com (Net News Admin)
Organization: SPSS Inc.
References: <1asq47INNr9o@smaug.West.Sun.COM> <1992Oct5.195433.9320@spss.com> <718611244@sheol.UUCP>
Date: Mon, 12 Oct 1992 20:33:59 GMT
Lines: 58

In article <718611244@sheol.UUCP> throopw@sheol.UUCP (Wayne Throop) writes:
>: From: markrose@spss.com (Mark Rosenfelder)
>: Message-ID: <1992Oct5.195433.9320@spss.com>
>: The problem is, however, that from
>: the event the computer can learn effectively nothing about the keypress, or
>: about keys, or about keyboards.
>
>My contention is that the computer's knowledge can be innate, (or more
>specifically, it can acquire knowledge about the objects and events that
>it can discriminate via its keyboard/scanner/microphone/whatnot by way
>of its ethernet port, disk drive, serial port, ROM, and so on). 

And my contention is that, with most of these devices, it *can't.*  
The information needed to learn about keys through the hardware connection 
to the keyboard (whatever form that may take), much less to learn about
fingers, books, cats, or whatever else hits the keyboard, just isn't there.
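To make that concrete, here is an illustrative sketch (in Python, purely
for exposition; the hardware details are the standard set-1 PC scancode
convention, not anything from this thread): each key event arriving from
an XT/AT-style keyboard is a single byte, where bit 7 marks release and
the remaining bits name the key.  Nothing in that byte says what pressed
the key.

```python
def decode_scancode(byte):
    """Split a set-1 PC keyboard scancode byte into (key number, released?).

    Bit 7 distinguishes a release ("break") from a press ("make");
    bits 0-6 identify the key.  That is the *entire* content of the
    event: no information about fingers, books, or cats survives.
    """
    return byte & 0x7F, bool(byte & 0x80)

# The 'A' key has scancode 0x1E; releasing it sends 0x9E (0x1E | 0x80).
print(decode_scancode(0x1E))   # pressed:  (30, False)
print(decode_scancode(0x9E))   # released: (30, True)
```

Whatever richer story the machine is to tell about keys or keyboards has
to come from somewhere other than this one byte.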

>I think an entity can remain grounded through (pretty much) an arbitrarily
>small "bandwidth keyhole".  For example, consider a person who grows up
>in some small town.  The person is grounded, knows the meaning of the
>symbols referring to residents, buildings, locations, etc.  The person
>moves to Timbuktu, and corresponds with a sibling that remains behind.
>I claim that the person in Timbuktu is still grounded, and still means
>things by the symbols sent to the stay-at-home sibling.  Further, the
>remote sibling even means things by the symbols used to refer to buildings
>built after the move to Timbuktu.  The person was grounded, and then
>*stays* grounded via a very small "symbolic" keyhole.

No, he was *and remains* grounded due to his extensive sensorimotor 
experience in the small town.  The "small bandwidth" of connection to the
town once he moves out is irrelevant: he would remain grounded (his speech
would continue to be meaningful) even if he spent most of his time in a
sensory deprivation tank.  

Grounding isn't something like a credit rating or a membership in the ACLU, 
which you have to actively maintain.  Once you've got it you keep it.  

So your example doesn't show that grounding is possible with an extremely
narrow sensory bandwidth.

>It seems clear to me that on a short timescale, and within reasonable
>limits, the bandwidth of the senses doesn't seem to affect the "amount
>of groundedness" (eg: I don't feel I mean less by the words I utter with
>my eyes closed as compared with them open).  And while it's a wild
>extrapolation, I don't think a longer timescale or wider variance of
>bandwidth will be a fundamental problem. 

Groundedness doesn't diminish one whit when you close your eyes, any more
than your capacity for vision does.  What could change things is if you grew
up blind: then you'd lack grounding in (say) colors.

If you want a computer whose statements about color are grounded, you must
provide it with something like a color TV camera, an algorithm which can
acquire experience with the world using it, and time to do so.  If you've
done all that, I think I'd allow that it remains grounded even if you
remove the camera (tho' if the robot is conscious that might not be a
very nice thing to do...)
