Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!spool.mu.edu!agate!doc.ic.ac.uk!uknet!edcastle!cam
From: cam@castle.ed.ac.uk (Chris Malcolm)
Newsgroups: comp.ai.philosophy
Subject: Re: Grounding
Message-ID: <26604@castle.ed.ac.uk>
Date: 6 Oct 92 17:49:29 GMT
References: <1992Sep29.234928.15758@spss.com> <718221542@sheol.UUCP> <1992Oct5.195433.9320@spss.com>
Organization: Edinburgh University
Lines: 43

In article <1992Oct5.195433.9320@spss.com> markrose@spss.com (Mark Rosenfelder) writes:
>In article <718221542@sheol.UUCP> throopw@sheol.UUCP (Wayne Throop) writes:

>>Grounding, it still seems to me, can't be due to "transduction" or
>>"non-symbolic-ness" or whatnot, because humans and computers are
>>equivalent on these grounds.  It is only a persuasive illusion that
>>computers are "all symbolic" and humans have "non-symbolic" natures. 
>>The illusion is persuasive because myriads of hard-working and
>>intelligent hardware and software engineers have labored to perfect 
>>this illusion. 

I agree. It's interesting to note in this connection, moreover, that
it is a commonplace illusion among philosophy students that they _are_
entirely symbolic in their cognitive functions, e.g., the common
belief that it is impossible to think without thinking in (something
like) words.

>How *can* the entity discriminate objects if it lacks senses and a mass
>of experience with the senses and the objects?  What, besides experience,
>can provide any link between objects (meaning things outside the system)
>and the entity's internal structure?

Experiments have been done on this in animals. In Held and Hein's
"kitten carousel" experiments, for example, one kitten walks around
under its own power while a sibling, yoked to it in a cart, is dragged
passively through the same surroundings. Both kittens receive similar
visual experiences, but the passive one grows up deficient in its
visual understanding.

This illustrates my point that a history of sensory input is not
enough to acquire and maintain a grounded symbol system. It is crucial
that the sensory input be affected by effector output; in other words,
the crucial thing is developing the amalgam of sensing and action we
call "behaviour". Because behaviour is such an amalgam, discussing
grounding as though it were a fait accompli (rather than a continuous
process), and largely in terms of how sensory input (rather than
behaviour) is processed, leads to a rather fragile concept of
grounding, one which begins to smell very ad hoc when called upon to
separate the grounded wheat from the ungrounded chaff in various kinds
of robot and computer systems.
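
To put the point in more concrete terms, here is a toy sketch in
Python. Everything in it (the one-dimensional world, the two learners,
the numbers) is invented purely for illustration; it is not a model of
any real system, let alone a kitten. Two learners receive the
identical stream of sensations; only the "active" one also knows which
effector output produced each transition, and only it can recover the
contingency between action and sensation:

import random
from collections import Counter

# A toy one-dimensional world: ten positions, each with a distinct
# "colour", which is all the agent ever senses. Moving left or right
# changes what is sensed next.
WORLD = list("abcdefghij")

def sense(pos):
    return WORLD[pos]

def step(pos, action):
    # action is -1 (left) or +1 (right); the world wraps around
    return (pos + action) % len(WORLD)

random.seed(0)

# One shared stream of experience: a random walk through the world.
pos = 0
stream = []                     # (sensation, action, next sensation)
for _ in range(5000):
    action = random.choice([-1, +1])
    nxt = step(pos, action)
    stream.append((sense(pos), action, sense(nxt)))
    pos = nxt

# The "active" learner knows which action it took, so it can learn
# the sensorimotor contingency (sensation, action) -> next sensation.
active = {}
for s, a, s2 in stream:
    active[(s, a)] = s2

# The "passive" learner (the kitten in the cart) receives exactly the
# same sensations, but not the actions, so the best it can do is
# tabulate sensation -> most frequent next sensation.
counts = {}
for s, _, s2 in stream:
    counts.setdefault(s, Counter())[s2] += 1
passive = {s: c.most_common(1)[0][0] for s, c in counts.items()}

# Test both predictors on fresh experience.
def accuracy(predict):
    pos, hits, trials = 0, 0, 1000
    for _ in range(trials):
        action = random.choice([-1, +1])
        nxt = step(pos, action)
        if predict(sense(pos), action) == sense(nxt):
            hits += 1
        pos = nxt
    return hits / trials

print("active :", accuracy(lambda s, a: active.get((s, a))))
print("passive:", accuracy(lambda s, a: passive.get(s)))

Run it and the active learner predicts its next sensation essentially
perfectly, while the passive learner, given exactly the same sensory
history, does no better than chance between the two neighbouring
possibilities. The difference is not in the input, but in whether the
input is coupled to output.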
-- 
Chris Malcolm    cam@uk.ac.ed.aifh          +44 (0)31 650 3085
Department of Artificial Intelligence,    Edinburgh University
5 Forrest Hill, Edinburgh, EH1 2QL, UK                DoD #205