From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Tue Jan 28 12:16:38 EST 1992
Article 3065 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Strong AI and panpsychism (was Re: Virtual Person?)
Message-ID: <1992Jan23.192900.6823@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1992Jan23.015152.510@psych.toronto.edu> <1992Jan23.153722.6392@watdragon.waterloo.edu>
Date: Thu, 23 Jan 1992 19:29:00 GMT

In article <1992Jan23.153722.6392@watdragon.waterloo.edu> cpshelle@logos.waterloo.edu (cameron shelley) writes:
>michael@psych.toronto.edu (Michael Gemar) writes:
>[...]
>> It is this panpsychism which functionalism seems to imply which makes me
>> *very* nervous.  I will agree that the above is not a *logical* argument
>> against Strong AI, but it certainly should cause its advocates to pause and
>> consider to what, at root, their position commits them (the ethical problems
>> alone boggle the mind!).
>
>Indeed it isn't a *logical* argument, and it ignores the role
>*persistence* plays in describing what must in part be a processual
>phenomenon.  I may not be too clear by what is meant by functionalism
>here, but functional theories of language (which is the sort of thing
>I'm more familiar with) are not limited to a limp description of
>competence, but also describe performance and the relationship 
>between them.  (I think language theory is relevant here, since I
>assume language is an intelligent behaviour.)
>
>Indeed, persistence of information over time in any system is the key
>to describing its behaviour as coherent.  To take your example, a
>roomful of air might indeed `momentarily' form a mindlike structure
>(though the odds render this possibility remote), but that structure
>has no persistence (by virtue of the inherent randomness of air under
>normal conditions), and will not therefore be a `mind'.  Entropy 
>triumphs again!
>
>Any natural language program has to deal with this problem.  The
>solution is, of course, memory.  
>

Why is persistence important?  For that matter, how do you know that
*you* persist through time, except through memories (which could have been
"implanted" in you)?  (This point has come up recently in this newsgroup.)
I certainly don't doubt that you believe that *you* have a mind.  But if
functionalism is taken seriously, then a functionally equivalent arrangement
of air molecules, which would by definition include "memories", would also,
for a brief instant, have *exactly* the same experience that you have.

My concern in the above example is not with the ascription of behavioural
coherence by an outside observer, but with the *ACTUAL* existence of a mind,
something which, I hope, most people in this newsgroup agree is *not*
dependent upon the judgement of outside observers (the truth of whether
*I* have a mind does not depend at all on what others think).  To object
that the arrangement described above would be too transitory is, first,
to fail to argue against it in principle (such a phenomenon could *in
principle* persist for an arbitrarily long period), and, second, to
confuse epistemology with ontology, namely the issue of *how we would
know* with the issue of *what is actually there*.

- michael