From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!darwin.sura.net!europa.asd.contel.com!uunet!psinntp!norton!brian Wed Feb  5 11:55:47 EST 1992
Article 3355 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!darwin.sura.net!europa.asd.contel.com!uunet!psinntp!norton!brian
From: brian@norton.com (Brian Yoder)
Subject: Re: Strong AI and Panpsychism
Message-ID: <1992Feb01.025045.455@norton.com>
Organization: Symantec / Peter Norton
References: <1992Jan29.194537.3658@colorado.edu>
Date: Sat, 01 Feb 1992 02:50:45 GMT

tesar@tigger.Colorado.EDU (Bruce Tesar) writes:
 
>     I'm not sure how useful a statement like the above is, unless the
> various moral philosophies come equipped with useful definitions of
> conciousness. In your case, you might want to figure out which is the
> cart and which is the horse. Does it make sense to base your entire
> moral system on consciousness, and THEN decide what consciousness is?

It certainly is a case of putting the cart before the horse.  Before you can
draw any conclusions about morality, you need at least a theory of
epistemology; otherwise, how could you even know whether there are any facts
which can be known?  Or how to derive them?  Or whether you can be certain
they are true?  Or what certainty and truth are?  Likewise, you can't develop
a theory of epistemology without first determining some things about reality.
Metaphysical questions like "Does anything exist at all?", "Is reality just
an illusion generated by my mind?", or "Can contradictions exist?" are the
basis for any epistemological theory.  Jumping in at the middle allows
unstated assumptions and unproven premises to invade your system of thought
and cast you into hopeless contradictions.
 
>     As for alternatives, what is wrong with simply declaring that
> HUMAN BEINGS are the agents worthy of moral concern? The category is
> well-recognized in all cultures, and even has a sound scientific meaning.
> A computer, no matter how intelligent and/or conscious it becomes, is
> still not a human being.

To answer that question it is necessary to determine what the purpose of
morality is.  Without the answer to that question, the question "Is this moral
position a correct application of morality?" cannot be answered.

>     Are the comatose and the severely retarded less worthy of moral
> consideration than fully functional humans?

No, they are not, but without understanding the roots of the ideas involved,
neither the question nor the answer makes any sense.  The more immediate
question, "Why should one be moral?", has to be answered before that one can
be.  Some would say that "The purpose of morality is to make God happy."  If
that were the case, all one would have to do to determine whether something
was immoral would be to consult scripture or pray.  Others would say that
"The purpose of morality is the minimization of pain in society."  If that
were true, you would have to measure the pain the comatose or retarded person
is in and act accordingly.  Still others would say "Morality is just an
arbitrary social choice."  Such a person would just have to take a poll to
determine what is moral.  My own answer would be "Morality is a set of
principles to guide me in living my life."  So to answer the question, I
would have to determine the effects of ignoring the moral status of such
people.  Can you see how the answer to this more fundamental question is
necessary to draw the higher-level conclusion?

To raise a side issue for a moment: the moral status of an individual is
predicated on his being able to choose between right and wrong.  We don't
say that a car is immoral because it hit a pedestrian.  We don't say that a
comatose patient who strikes a nurse during convulsions is immoral, do we?
That's because he's not a moral agent, and that is because he can't choose
his actions.

--Brian
 
-- 
-- Brian K. Yoder (brian@norton.com) - Q: What do you get when you cross     --
-- Peter Norton Computing Group      -    Apple & IBM?                       --
-- Symantec Corporation              - A: IBM.                               --
--