From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!garrot.DMI.USherb.CA!uxa.ecn.bgu.edu!psuvax1!ukma!memstvx1!langston Wed Feb  5 11:55:43 EST 1992
Article 3348 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!garrot.DMI.USherb.CA!uxa.ecn.bgu.edu!psuvax1!ukma!memstvx1!langston
>From: langston@memstvx1.memst.edu
Newsgroups: comp.ai.philosophy
Subject: Re: Strong AI and panpsychism (was Re: Virtual person?)
Message-ID: <1992Jan31.173006.1194@memstvx1.memst.edu>
Date: 31 Jan 92 23:30:06 GMT
References: <1992Jan30.171309.1168@memstvx1.memst.edu> <1992Jan31.211540.13945@psych.toronto.edu>
Organization: Memphis State University
Lines: 74

In article <1992Jan31.211540.13945@psych.toronto.edu>, michael@psych.toronto.edu (Michael Gemar) writes:
> In article <1992Jan30.171309.1168@memstvx1.memst.edu> langston@memstvx1.memst.edu writes:
>>
>>  In the discussion of consciousness in this thread, are we adopting the
>>'objective' view of consciousness (as I try to argue), or a more third-
>>person, subjective view?  If the subjective experience is what is
>>being considered here as consciousness, I would argue that this could be
>>explained as an agent's (or an agent's processes) goal-directed behaviour
>>regarding incoming information.
> 
> It is not at all clear to me how "goal-directed behaviour" explains
> *experience*.   The notions of "agent" and "goal" *assume* the notion
> of consciousness, that which is to be explained.  I think that the
> above definition is a standard one when taking the "objective view"
> of consciousness, but it does nothing to explain experience.
> 
> - michael

   Okay.  Let's start by clarifying what I meant by 'agent'.  I consider an
agent to be any process, or assemblage of processes, that has some sort of
interaction with its environment (however sparse that may be).  This is a la
Minsky's agents, to draw a comparison.  By goal I was simply meaning to
convey the concept of intentional progress towards a desired state.
   Now then, before we all have a heyday with these vague generalizations, let
me elaborate a little.  Granted, applying my above assertion re:consciousness
to very SIMPLE systems makes the whole argument sound silly.  In this
instance I stand firm in excluding such systems as thermostats.
   However, what I am trying to get at is an explanation of the subjective
experiences of a VERY COMPLEX system.  For example, I would assert that what
you call 'red' and what I call 'red' are two separate descriptions of the
same light waves.  We are taking in the same information, yet we are
perceiving it differently.
   Now then.  Consider a random scattering of pixels on your CRT.  This pattern
probably does not hold any information for you.  But, if you examined the
pattern with the goal of determining the total number of illuminated pixels,
the pattern suddenly becomes meaningful to you.  If you approached it with the
goal of connecting the dots, the pattern is again meaningful, but this time
in a different way.
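To make the pixel example a bit more concrete, here is a toy sketch (the goal functions and names are my own invention, not any established model): the same grid of pixels, examined under two different goals, yields two distinct interpretations.

```python
import random

# A random scattering of "pixels" -- the same external stimulus for every agent.
random.seed(0)
grid = [[random.randint(0, 1) for _ in range(8)] for _ in range(8)]

def goal_count(grid):
    """Goal 1: determine the total number of illuminated pixels."""
    return sum(sum(row) for row in grid)

def goal_connect(grid):
    """Goal 2: find the 'dots' to connect -- coordinates of lit pixels."""
    return [(r, c) for r, row in enumerate(grid)
                   for c, val in enumerate(row) if val]

# Same grid, two goals, two different "meanings" extracted from it.
count = goal_count(grid)
dots = goal_connect(grid)
```

The stimulus itself carries no single meaning; which interpretation emerges depends entirely on the goal the agent brings to it.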
   Another example.  I (personally) can't listen to free-form jazz for very
long.  It all sounds like noise to me.  But to someone who understands the
music, the seemingly 'random' patterns of tone and rhythm suddenly make sense.
   In the above examples, each individual (agent) is operating on the same
external stimuli, and depending upon his/her/its goal in interpreting the
stimuli, develops a distinctly subjective view of the stimuli.
(I am probably slaughtering D. Chalmers' pattern-information theory and its
many-many relationship, but I'm doing my best, David :^)  )
   
   Whether the 'goal' (again my broadened conception of the term) of the
agent is making use of the incoming information or not will also have an
impact on the subjective experience of the agent re:that information.
   For example,  imagine you are in a crowded room, full of people talking.
If you are engaged in a conversation (your 'goal'), all the other discussion
is noise to you.  If someone calls your name from across the room,
(let's assume we all have a resident 'goal' of attending to our names when we
hear them called), your signal (the conversation) becomes noise, and part of
the previous 'noise' becomes signal.
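The crowded-room case can be sketched the same way (again a toy model of my own devising, with invented names): the active goal partitions one and the same auditory stream into signal and noise, and swapping goals flips the partition.

```python
# The same stream of utterances reaches the agent regardless of goal.
stream = [("stranger A", "stock tips"),
          ("my partner", "weekend plans"),
          ("stranger B", "Mark!"),          # someone calls the agent's name
          ("stranger A", "more stock tips")]

def perceive(stream, goal):
    """Split utterances into (signal, noise) according to the active goal."""
    signal, noise = [], []
    for speaker, words in stream:
        (signal if goal(speaker, words) else noise).append(words)
    return signal, noise

# Goal 1: attend to the conversation with one's partner.
conversing = lambda speaker, words: speaker == "my partner"
# Goal 2: the resident goal of attending to one's own name.
name_called = lambda speaker, words: "Mark" in words

s1, n1 = perceive(stream, conversing)   # "weekend plans" is signal
s2, n2 = perceive(stream, name_called)  # "Mark!" is signal; the rest is noise
```

Identical input, different active goal, and what was signal under one goal lands in the noise under the other.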

   I'm not quite sure how to verbalize my entire idea here.  Basically, what
I am trying to convey here is that two identical agents (theoretically, now...
I know that isn't possible at a complex scale...) subjected to identical
external stimuli, will have different subjective experiences depending upon
the active goals of the two agents.  And, if we are defining 'consciousness'
as the subjective experience (as I think I have seen a few do), then this
is my position on what that 'consciousness' is.
 
-- 

Mark C. Langston                                  "What concerns me is not the
Psychology Department                              way things are, but rather
Memphis State University                           the way people think things
LANGSTON@MEMSTVX1.MEMST.EDU                        are."     -Epictetus

     "...a brighter tomorrow?!?  How about a better TODAY?"  -me


