From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!psych.toronto.edu!michael Thu Apr 16 11:33:40 EDT 1992
Article 5011 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: SHRDLU's mind
Organization: Department of Psychology, University of Toronto
References: <1992Apr6.023638.518@organpipe.uug.arizona.edu> <1992Apr7.203327.516@psych.toronto.edu> <1992Apr8.081207.1027@ccu.umanitoba.ca>
Message-ID: <1992Apr9.201157.16607@psych.toronto.edu>
Date: Thu, 9 Apr 1992 20:11:57 GMT

In article <1992Apr8.081207.1027@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca (Antun Zirdum) writes:
>In article <1992Apr7.203327.516@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:

[in response to "SHRDLU has a *little* mind..."]

>>OK, so what are the moral consequences of this belief?  
>>

>Moral consequences??? What is being talked about here?
>Are you telling me that you don't step on ants? You
>don't eat meat and plants? (hundreds of other examples
>omitted) I am not attempting to answer this for KE, but
>the moral consequences are the same as when you first
>realized as a baby that animals have feelings, but we
>still use them, we still consume them.

Well, Antun, that won't wash with me.  I'm vegetarian.

>Please make this clear: why do we attribute moral
>agency to things with minds? My belief as to why we
>do this is that things with minds can be influenced
>in behaviour (NOT BECAUSE THEY INFLUENCE!). Thus a moral
>agent is a being that fits your mental model of that
>being. Thus a human is a moral agent, an animal is
>less so, and a rock is practically a non-agent.

This is nonsense.  Why wouldn't HAL fit your "mental model"
of a moral agent?

>	To be a little more explicit, the human being
>attempts to match his behaviour to what he thinks you
>expect. The animal does this to a degree less, and the
>rock does not attempt to change its behaviour in any
>way to match your mental model of itself.
>	Examples: You say "Nice weather", I reply
>"Sure is!" --> I am a moral agent.
>You say "Nice weather", I beat your brains in! --> I
>am not a moral agent!
>
>You say "nice doggy", the dog comes to you and whines.
>--> it is a partial moral agent.
>You say "nice rock", the rock does nothing! --> It is
>not a moral agent.
>
>To be a moral agent, it comes down to being capable
>of being such.

Now if this ain't circular, I don't know what is...

> Even the law takes agency into account
>with its insanity plea, in effect it is a plea that
>that person was not a moral agent.
>
>So what makes me more of a moral agent than the dog?
>In a word, capability! I have in mind what you expect
>of me, while the dog only has a partial knowledge of
>expectation, the rock has no such knowledge!

What prevents an instantiated program from having the crucial qualities
you identify?  If the answer is "nothing," then we have to worry about
the moral status of our creations.  

>(Incidentally, if this sounds like some theory from
>somewhere, be assured that it is only coincidence.
>My thought processes are being transcribed directly!)

Hmm...I thought so...

- michael
