Newsgroups: comp.ai.philosophy,sci.logic
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!news.alpha.net!uwm.edu!cs.utexas.edu!howland.reston.ans.net!pipex!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: Deontic Logic. What is it?
Message-ID: <1995Feb19.015823.20180@oracorp.com>
Organization: Odyssey Research Associates, Inc.
Date: Sun, 19 Feb 1995 01:58:23 GMT
Lines: 109
Xref: glinda.oz.cs.cmu.edu comp.ai.philosophy:25641 sci.logic:9736

pautler@ils.nwu.edu (David Pautler) writes:

>daryl@oracorp.com (Daryl McCullough) wrote:
>
>> It should be the case that if you go help your neighbor, you should
>> tell him you are coming (instead of just dropping in uninvited). This
>> can be formalized as: 
>> 
>> 2. O(H -> T)
>> 
>> If you don't plan to help your neighbor, you really shouldn't lie and
>> say that you are going to. This can be formalized as (using ~ as
>> negation):
>> 
>> 3. ~H -> O(~T)
>
>Why didn't you write (3) as O(~H -> ~T)?

Because I meant something stronger. I meant: if it is the case
that you are not going to help (that is, if ~H holds), then you
are obligated not to tell your neighbor that you are going to help
(that is, O(~T)). So, ~H -> O(~T).

>The English statement of (3)
>is a semantic parallel of (2), allowing for negations, so the
>formulations should also be parallel.

No, the English statements are not parallel. The English version of
2 is:

   It should be the case that if you help, then you tell your
   neighbor you are going to help.

while the English version of 3 is

   If you do not help, you should refrain from telling your neighbor
   that you are going to help.

In sentence 2, the implication is inside the scope of the modal
operator "should", while in sentence 3, the implication is outside the
scope.

I agree that the asymmetry between 2 and 3 is a bit contrived, since
one could have worded them symmetrically. Anyway, regardless of how
we came up with assumptions 2 and 3, the question is to see why they
lead to a contradiction.
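In the standard Chisholm-style setup, premises 2 and 3 are paired with
O(H) and the fact ~H (neither is quoted in this excerpt, so take that
pairing as an assumption here). Under the usual possible-worlds reading
of O -- O(A) holds just in case A is true at every deontically ideal
world -- a brute-force search over candidate sets of ideal worlds shows
there is no model:

```python
from itertools import combinations, product

# A world is a pair (H, T): does the agent help, does he tell.
WORLDS = list(product([False, True], repeat=2))

def O(phi, ideal):
    # Standard possible-worlds reading: O(phi) holds iff phi is true
    # at every deontically ideal world.
    return all(phi(H, T) for (H, T) in ideal)

def premises_hold(ideal, actual_H):
    p1 = O(lambda H, T: H, ideal)                  # O(H) -- assumed companion premise
    p2 = O(lambda H, T: (not H) or T, ideal)       # 2. O(H -> T)
    p3 = actual_H or O(lambda H, T: not T, ideal)  # 3. ~H -> O(~T)
    p4 = not actual_H                              # ~H -- the fact
    return p1 and p2 and p3 and p4

# Search every nonempty set of ideal worlds (nonemptiness corresponds
# to the D axiom: the ideal is achievable).
models = [ideal for n in range(1, len(WORLDS) + 1)
          for ideal in combinations(WORLDS, n)
          if premises_hold(ideal, actual_H=False)]
print(models)   # [] -- the premises are jointly unsatisfiable
```

Premises 1 and 2 force T at every ideal world, while premises 3 and 4
force ~T at every ideal world, so the only escape is an empty set of
ideal worlds, which the D axiom rules out.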

>Your formalization of (3)
>says something stronger: whether or not H is obliged, the fact of it
>obliges T. 

I think you mean "whether or not H is obliged, the fact of its
*negation* obliges *the negation of* T. I don't see what is wrong
with that. It seems to me to be an instance of the general obligation
not to lie: If something is false, then you are obliged not to say
that it is true.

>> I think that we can recover a kind of obligation operator
>> from a system of default reasoning as follows:
>> 
>> 
>>         O(A) (Meaning you are obligated to bring about A, or
>>               it is desirable to bring about A.)
>>         is defined to mean:
>>         w(A) > w(~A).
>> 
>> where w(A) means something like "the most likely world (outcome) from
>> trying to bring about A", and > is a preference ordering on worlds.
>> That is, you should try to bring about A if the most likely result
>> of trying to bring about A is better than the most likely result of
>> trying to bring about its negation.
>
>Does the notion of an absolute preference make sense?

Beats me. I don't know why not.

>Does it make sense for use by a deontic logic?

I'm thinking the other way around. Does using deontic logic make any
sense? I think that a logic only makes sense to the extent that there
is a model theory behind it.
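A toy model theory for the preference-based O quoted above might look
like this (every concrete world, ranking, and "most likely outcome"
below is invented purely for illustration):

```python
# Worlds, ranked by a preference order (higher number = more preferred).
preference = {
    "helped_and_told": 3,
    "helped_unannounced": 2,
    "stayed_home": 1,
    "lied_and_stayed_home": 0,
}

# w(X): the most likely world resulting from trying to bring about X.
# These assignments are stipulated, not derived from anything.
most_likely = {
    "help": "helped_and_told",
    "not help": "stayed_home",
}

def obligated(a, not_a):
    # O(A) is defined as w(A) > w(~A): the most likely result of
    # trying A is preferred to the most likely result of trying ~A.
    return preference[most_likely[a]] > preference[most_likely[not_a]]

print(obligated("help", "not help"))   # True under this toy ranking
```

Nothing here fixes where the preference order comes from; the point is
only that once the order and the w function are given, O is determined.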

>Don't people actually infer
>preferences for actions they will perform from weighing obligations
>and desires?

I'm not talking about how people derive their preferences; I'm only
concerned with how one can reason about preferences (wherever they
come from).

>Starting from preferences would be like trying to guess
>if two numbers were prime, knowing only their sum.

I don't see the relevance of this analogy. What I am interested in is
what we can say about preferences for partial specifications of the
world, given preferences for total specifications of the world. It's a
difficult problem, as is illustrated by the classic horror story "The
Monkey's Paw". In this story, there is a magic talisman, the monkey's
paw, which gives its possessor three wishes. The couple who finds the
talisman wishes to become rich, and they get their wish in a most
gruesome manner (their son is killed in a freak accident and the
couple gets the insurance money). There is a problem with saying that
X is desirable, or that X should be done, because getting X could have
undesirable side effects.
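The Monkey's Paw trap shows up in miniature when you try to lift a
preference order on total worlds to a partial specification like
"rich" in the two obvious ways (the worlds and ranks below are made up
for illustration):

```python
# A preference order over total worlds (higher rank = more preferred).
worlds = {
    # (wealth, son) -> rank
    ("rich", "son alive"): 3,
    ("poor", "son alive"): 2,
    ("poor", "son dead"): 1,
    ("rich", "son dead"): 0,   # the Monkey's Paw outcome
}

def best(wealth):
    # Optimistic lifting: rate a partial spec by its best completion.
    return max(rank for (w, s), rank in worlds.items() if w == wealth)

def worst(wealth):
    # Pessimistic lifting: rate it by its worst completion.
    return min(rank for (w, s), rank in worlds.items() if w == wealth)

# "Rich" beats "poor" under the optimistic lifting...
print(best("rich") > best("poor"))     # True
# ...but loses under the pessimistic one: the wish for wealth can be
# granted by the least preferred rich-world.
print(worst("rich") < worst("poor"))   # True
```

The two liftings disagree about whether wealth is desirable, which is
exactly the problem with reading "X should be done" off a preference
for the partial specification X alone.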

Daryl McCullough
ORA Corp.
Ithaca, NY


