From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!uunet!munnari.oz.au!comp.vuw.ac.nz!waikato.ac.nz!aukuni.ac.nz!kcbbs!nacjack!codewks!system Mon Nov  9 09:36:54 EST 1992
Article 7527 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!uunet!munnari.oz.au!comp.vuw.ac.nz!waikato.ac.nz!aukuni.ac.nz!kcbbs!nacjack!codewks!system
From: system@CODEWKS.nacjack.gen.nz (Wayne McDougall)
Newsgroups: comp.ai.philosophy
Subject: Re: The Paradox of the Unexpected Hanging
Message-ID: <m20TTB5w165w@CODEWKS.nacjack.gen.nz>
Date: 7 Nov 92 13:02:21 GMT
References: <1992Nov3.051001.21374@oracorp.com>
Organization: The Code Works Limited, PO Box 10 155, Auckland, New Zealand
Lines: 40

daryl@oracorp.com (Daryl McCullough) writes:

> The statement "You will be executed today, but you will not be able to
> figure out that you will be executed today" is *not*
> self-contradictory. Suppose the prisoner thinks the judge might be
> lying, then he can't figure out anything from what the judge says.
> Therefore the judge's second statement "...you will not be able to
> figure out that you will be executed today" will turn out to be true.
> If in addition, the judge *does* hang the prisoner, then the first
> statement "You will be executed today" will turn out to be true. If
> everything the judge says turns out to be true, then it can't be
> self-contradictory.

I don't see how this can work. If the prisoner thinks the judge might 
be lying about the first part, he might conclude that the judge is lying 
about the second part also; that is, that he *can* figure out whether he 
will be executed today. It's easy if he assumes the whole sentence is a 
lie: he can figure out that he won't be executed today, obviously. But 
that is an unreasonable assumption outside these little problems (which 
are often prefaced with "X always tells the truth" or "Y always lies"). 
Surely a "judge" is shorthand for "someone reliable who we can expect 
to always tell the truth in artificial problems".
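To make the case analysis concrete, here is a toy model (mine, not 
either poster's formalisation) of the judge's two-part statement. It 
assumes "figure out" means: the prisoner deduces the hanging only if he 
treats the judge's words as certainly true. Under that assumption you 
can check both Daryl's scenario and the trusting-prisoner case 
mechanically:

```python
# The judge asserts (A) "you will be hanged today" and
# (B) "you will not be able to figure out that A is true".
# Toy assumption: the prisoner deduces A from the judge saying A
# only if he trusts the judge completely.

def prisoner_deduces_hanging(trusts_judge):
    # A distrusting prisoner can conclude nothing from the judge's words.
    return trusts_judge

def judge_statement_true(hanged, trusts_judge):
    part_a = hanged                                        # A came true
    part_b = not prisoner_deduces_hanging(trusts_judge)    # B came true
    return part_a and part_b

# Daryl's scenario: distrusting prisoner, judge hangs him anyway ->
# both parts of the statement turn out true.
assert judge_statement_true(hanged=True, trusts_judge=False)

# The trusting-prisoner case: part B fails, so the whole statement
# cannot come out true.
assert not judge_statement_true(hanged=True, trusts_judge=True)
```

Under this (admittedly crude) model, the whole disagreement sits in the 
`prisoner_deduces_hanging` line: whether a prisoner who distrusts the 
judge really deduces nothing, or instead reasons from "the judge is 
lying" to a conclusion of his own, as argued above.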

Surely, to say in real life that someone may be lying means all bets are 
off. Does the sun come up every day, or does reality (or do my senses) 
lie to me on occasion? [Nasty trick to play on an AI system: play it a 
video tape of a sunrise at 2am at the equator. Now, is it going to 
assume there is some sort of exception to its existing observations 
(it's normally a camera, not a video player, that is connected), or is 
it going to think someone is lying to it?]

How can you run logic in a system where some statements are false, and 
you are not allowed to collect sufficient information to eliminate 
false statements (as contrasted with unprovable ones)?

-- 
  Wayne McDougall, BCNU
  This .sig unintentionally left blank.

Hello! I'm a .SIG Virus. Copy me and spread the fun.


