From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!ccu.umanitoba.ca!zirdum Thu Apr 16 11:33:20 EDT 1992
Article 4974 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!ccu.umanitoba.ca!zirdum
From: zirdum@ccu.umanitoba.ca (Antun Zirdum)
Newsgroups: comp.ai.philosophy
Subject: Re: SHRDLU's mind
Message-ID: <1992Apr8.081207.1027@ccu.umanitoba.ca>
Date: 8 Apr 92 08:12:07 GMT
References: <1992Apr5.210553.11966@psych.toronto.edu> <1992Apr6.023638.518@organpipe.uug.arizona.edu> <1992Apr7.203327.516@psych.toronto.edu>
Organization: University of Manitoba, Winnipeg, Manitoba, Canada
Lines: 65

In article <1992Apr7.203327.516@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
>In article <1992Apr6.023638.518@organpipe.uug.arizona.edu> bill@NSMA.AriZonA.EdU (Bill Skaggs) writes:
>>Kristoffer Eriksson:
>>>Surely, no-one has suggested that SHRDLU is advanced enough to have a mind?
>>
>>Christopher Green:
>>>Under strong AI, one would be committed to such a view. Surely, if McCarthy
>>>believes his thermostat has beliefs, he believes that SHRDLU does. Same
>>>goes for any other thorough-going functionalists. Right Dave...?
>>>
>>
>>  I think I'm a backer of strong AI, but I don't believe that
>>mind is an all-or-nothing concept.  Minds come in varying degrees
>
>OK, so what are the moral consequences of this belief?  
>
Moral consequences??? What exactly is being talked about
here? Are you telling me that you don't step on ants?
That you don't eat meat and plants? (Hundreds of other
examples omitted.) I am not attempting to answer this
for KE, but the moral consequences are the same as when
you first realized as a baby that animals have feelings:
we still use them, and we still consume them.
Please make this clear: why do we attribute moral
agency to things with minds? My belief is that we do
this because things with minds can be influenced
in behaviour (NOT BECAUSE THEY INFLUENCE!). Thus a moral
agent is a being that fits your mental model of that
being. Thus a human is a moral agent, an animal is
less so, and a rock is practically a non-agent.
	To be a little more explicit, the human being
attempts to match his behaviour to what he thinks you
expect. The animal does this to a lesser degree, and
the rock does not attempt to change its behaviour in
any way to match your mental model of it.
	Examples: You say "Nice weather", I reply
"Sure is!" --> I am a moral agent.
You say "Nice weather", I beat your brains in! --> I
am not a moral agent!

You say "nice doggy", the dog comes to you and whines.
--> it is a partial moral agent.
You say "nice rock", the rock does nothing! --> It is
not a moral agent.

Being a moral agent comes down to being capable of
being one. Even the law takes agency into account
with its insanity plea; in effect, it is a plea that
the person was not a moral agent.

So what makes me more of a moral agent than the dog?
In a word, capability! I have in mind what you expect
of me, while the dog has only partial knowledge of your
expectations, and the rock has no such knowledge at all!
(Incidentally, if this sounds like some theory from
somewhere, be assured that it is only coincidence.
My thought processes are being transcribed directly!)
>- michael 


-- 
*****************************************************************
*   AZ    -- zirdum@ccu.umanitoba.ca                            *
*     " The first hundred years are the hardest! " - W. Mizner  *
*****************************************************************


