From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!watserv1!watdragon!logos.waterloo.edu!cpshelle Wed Feb  5 11:56:30 EST 1992
Article 3429 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!watserv1!watdragon!logos.waterloo.edu!cpshelle
From: cpshelle@logos.waterloo.edu (cameron shelley)
Subject: Re: Strong AI and panpsychism (was Re: Virtual Person?)
Message-ID: <1992Feb3.145017.27440@watdragon.waterloo.edu>
Sender: news@watdragon.waterloo.edu (USENET News System)
Organization: Evil Designs Inc.
References: <1992Jan31.193524.28969@psych.toronto.edu>
Date: Mon, 3 Feb 1992 14:50:17 GMT
Lines: 118

michael@psych.toronto.edu (Michael Gemar) writes:
> Cam and I go at it again, but I think we're getting closer...

Let's hope so!

> But I think it *does* clarify things.  SHRDLU "lived" in an artificial
> environment that was part of the program.  Any other material system could
> reproduce this program *with its environment*, including a roomful of
> air.  Voila!  Situated intelligence in a tornado.  :-)
> 
> My point is that inputs and outputs need not correspond to the real
> world in order to produce situatedness.  I don't believe that consciousness
> existing in a void of stimuli would be very interesting.  But this situation
> is not necessary in the cases that I proposed.  If you can program the
> environment, as in virtual reality, then you can produce "situatedness"
> in a formal system.  

I concur.  (I thought I had in my last posting!)  But as I mentioned
before, this is just a form of indirection which imposes a certain level
of description on `situation'.  From Minsky's posting (later in this
thread), I take it I've been put into a camp of philosophy regarding
the so-called symbol-grounding problem, which apparently regards
situatedness as a law-like isomorphism between levels of description. 
I would like to state for the record that I don't share this view, but
I think this is part of the miscommunication we've been experiencing
in our consciousnesses during exposure to this discussion.  ;-)
Please, do not adjust your mindset! 

I am happy with virtual reality as a situatedness for an appropriate
formal system.  The issue, I think, then is the adequacy of the level
of representation to fit the observable phenomena of consciousness
(which are not certain).  Anyhow... 

> Well, currently, subjective criteria are the only certain ones we
> have for consciousness.  If you are arguing that an environment is
> necessary for consciousness, I might still disagree, but I don't think
> that, in the cases I talked about earlier, it is necessary to assume
> that a virtual environment is not possible.

I am arguing that an environment is necessary for consciousness, and
hopefully we both see that I didn't make the above assertion about
virtual environments.

> >Having a brain gives me the opportunity of having conscious experience,
> >but brains haven't arisen spontaneously.  Billions of years of
> >evolution (read interaction between genetic persistence and 
> >environment) have gone into each one.  Putting one in a vat doesn't
> >suddenly negate this fact.
> 
> This is certainly true, but I don't understand your point here.

More of the argument for situatedness: non-local with regard to space
and time---as you might expect.  My contention is that this, along
with material, is necessary for consciousness, by any standard I
think we could recognize.

> You could mean two things here:
> 1) matter is necessary for consciousness
> 2) stimuli from "outside" an entity is necessary for consciousness
> 
> I would not deny 1).  I am happy to be a provisional materialist.
> 
> As for 2), I am not certain that outside stimulation is necessary (I haven't
> finished thinking about that).  However, both you and I seem to believe that
> "virtual" stimuli are enough, and that stimuli need not have connection to
> the "real world".  If you grant this, then you have (if I understand you)
> the situatedness you desire.

I think we understand one another.  My quibble is still with your
previous assertion: that there is a non-zero chance of a collection of
material attaining even a transitory conscious structure.  I find this
an extremely weak basis for general statements about consciousness,
and therefore for any ethical considerations based on such
generalities.  Your remarks about intelligent rocks and tornados have
not assuaged this concern any. 


> Would you follow after Chalmers in asserting that atoms therefore have
> "limited" consciousness?  And if all you ask for is a change in state
> provoked by the "outside", then practically *any* physical feature
> can generate "consciousness".  This to me makes consciousness a meaningless
> term.  If atoms can have subjective experiences, if there is something that
> it is like to be an atom, then I think that the notion of mentality becomes
> vacuous.

I recall answering this exact question in the article previous to this one!
If we take environment seriously as a constraint on consciousness, then
atoms and galaxies are even less likely than bodies of air to be ascribed
consciousness.  I invite you to review my previous posting containing this
observation.  You also did not answer my question about whether you take
consciousness to be binary.  Your postings in the past indicate you do
have a binary view of the issue, and a deterministic requirement for
knowledge.  I cannot agree with either, so part of our disagreement may
simply be varying assumptions regarding these points.

> Well, if I kill you, I am not depriving the lump of matter that is your
> body (or your brain, for that matter) of experiences, since this system will
> still "experience" the outside and react by changing its internal state
> (decaying).
> So any ethical system based on the inherent worth of experience is out
> the window.  You may argue that the experiences that "you" experience are
> more worthwhile than those of simply your decaying body, but you'd have
> to provide an argument for that.
> 
> I also think that notions such as personal identity may be compromised
> by this kind of pan-experientialism, but I haven't clarified my objections
> quite yet.

Well, as I indicated above, I don't think statements based simply on 
non-zero probabilities of events are much of a basis for ethical argument.

Gotta go!

				Cam
--
      Cameron Shelley        | "Syllogism, n.  A logical formula consisting
cpshelle@logos.waterloo.edu  |  of a major and a minor assumption and an
    Davis Centre Rm 2136     |  inconsequent."
 Phone (519) 885-1211 x3390  |				Ambrose Bierce