Newsgroups: comp.ai.philosophy,alt.consciousness.mysticism,alt.consciousness,alt.paranormal.channeling,talk.philosophy.misc,alt.pagan,alt.atheism,talk.religion.newage
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!news.sei.cmu.edu!cis.ohio-state.edu!math.ohio-state.edu!howland.reston.ans.net!spool.mu.edu!bloom-beacon.mit.edu!news.media.mit.edu!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Subject: Re: rereRe: The end of god
Message-ID: <1994Oct7.163844.29829@news.media.mit.edu>
Sender: news@news.media.mit.edu (USENET News System)
Cc: minsky
Organization: MIT Media Laboratory
References: <36uvgd$33f@rpc28.gl.umbc.edu> <36vt2m$g6m@scapa.cs.ualberta.ca> <Cx9uJ1.2H4@nntpa.cb.att.com>
Date: Fri, 7 Oct 1994 16:38:44 GMT
Lines: 85

In article <Cx9uJ1.2H4@nntpa.cb.att.com> ka@socrates.hr.att.com (Kenneth Almquist) writes:
>Kevin Wiebe <kevin@swanlake.cs.ualberta.ca> wrote:
>> Godel...used logic to prove that ALL formal systems (ALL logic) is
>> INCOMPLETE.
>
>An INCOMPLETE system is a system that allows you to express propositions
>which can be neither proved nor disproved within the system.  Goedel's
>proof applies to formal systems which are:
>
>1) consistent (in an inconsistent formal system it is possible to prove
>   anything, but such systems are not very useful), and
>
>2) powerful enough to include arithmetic.
>						Kenneth Almquist

And perhaps it should be noted that there's nothing magical about
"arithmetic" in this context.  Godel's proof used a way to assign
numbers to logical expressions, so that one expression, by including
such a number, could be interpreted as referring to another
expression.  This made it feasible to write expressions that could, in
effect, be seen as referring to themselves -- and thus, you could
produce statements that could be seen as, sort of, "liar's paradox"
statements or "diagonal arguments".

Now Godel's trick was to assign various prime numbers to various parts
of an expression, and then multiply them all together.  The heart
of the trick was to do this in such a way that you could get the
original expression back by factoring the number -- so it all depends
on the unique factorization into primes.  That's why it needed
"arithmetic" -- simply to be able to do that sort of reversible
compression.
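A sketch of that reversible compression in Python (the symbol codes here are made up for illustration, not Godel's actual assignment): the i-th symbol of an expression, with code c, contributes a factor (i-th prime)**c, and unique factorization lets you read the expression back out.

```python
def primes(n):
    """Return the first n primes by trial division."""
    found = []
    k = 2
    while len(found) < n:
        if all(k % p for p in found):
            found.append(k)
        k += 1
    return found

# Illustrative symbol codes (all nonzero, so every prime up to the
# expression's length appears in the product).
CODES = {'(': 1, ')': 2, '0': 3, 'S': 4, '=': 5, '+': 6}
SYMS = {v: k for k, v in CODES.items()}

def encode(expr):
    """Multiply prime(i) ** code(symbol i) over the expression."""
    n = 1
    for p, ch in zip(primes(len(expr)), expr):
        n *= p ** CODES[ch]
    return n

def decode(n):
    """Recover the expression by factoring: exponents are the codes."""
    out = []
    p = 2
    while n > 1:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        out.append(SYMS[e])
        p += 1
        while any(p % q == 0 for q in range(2, p)):
            p += 1
    return ''.join(out)
```

For instance, encode('0') is 2**3 = 8, and decode(encode('S0=S0')) gives back 'S0=S0'.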

Instead of this (but equivalently), you can use any formalism that
permits assigning a symbol to a list of symbols, in a reversible way.
John McCarthy showed how to do this using a list-processing language,
and thus produced a pretty clear one-page proof of Godel's theorem.
(You have to write a LISP program to represent the steps of a proof,
but that isn't hard.)
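In a list-processing language the reversible encoding comes almost for free, since an expression is already a nested list of symbols.  A Python analogue (this is my sketch, not McCarthy's actual construction): repr plays the role of the Godel number, and ast.literal_eval reverses it exactly.

```python
import ast

# An expression as a nested list of symbols; one string "symbol" can
# hold a complete, reversible description of it.
expr = ['forall', 'x', ['=', ['+', 'x', '0'], 'x']]
code = repr(expr)                      # encode: one symbol naming the list
assert ast.literal_eval(code) == expr  # decode: exact recovery
```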

Raymond Smullyan showed an even simpler way, by using a trick that
represented a sequence of symbols by a (base 3) number in which the
encoding used concatenation instead of multiplication!  So decoding the
number required merely looking at its digits, instead of having to
factor it.  This leads to rather short proofs, but ones that I find
harder to understand.
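A simplified sketch of that concatenation trick (a toy two-symbol alphabet, not Smullyan's actual system): give each symbol a nonzero base-3 digit, and let the code of a string be the number whose base-3 digits are its symbols in order.  Decoding just reads the digits off, and concatenating strings corresponds to shifting and adding their codes.

```python
DIGIT = {'a': 1, 'b': 2}   # toy alphabet; nonzero digits so none vanish
SYM = {1: 'a', 2: 'b'}

def encode(s):
    """The base-3 number whose digits are the symbols of s, in order."""
    n = 0
    for ch in s:
        n = n * 3 + DIGIT[ch]
    return n

def decode(n):
    """Read the base-3 digits back off -- no factoring needed."""
    digits = []
    while n:
        n, d = divmod(n, 3)
        digits.append(SYM[d])
    return ''.join(reversed(digits))
```

Here encode('ab') is 1*3 + 2 = 5, and the concatenation property holds: encode('abba') equals encode('ab') * 3**2 + encode('ba').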

So I think a better way to describe the condition for Godel's theorem
is not to say, rather obscurely, that if F is the formal system,
	"F can express arithmetic", but rather
	"F can represent expressions by symbols, in such a way that
you can write expressions that can be interpreted as referring to
themselves, so that you can represent (Cantor-like) diagonal arguments."


To me, this all seems to say that no formal system can have
simultaneously the three properties:

1. it is logically consistent.
2. it is expressive enough for commonsense purposes.
3. it can express proofs of all true statements it can express.

So what?  Penrose, Lucas, and all those other losers conceal an
assumption that good mathematicians can be consistent and still
recognize truths even when they can't prove them.

In commonsense parlance, this is called "guessing"!  The hidden
assumptions are (1) that, as mathematicians, they never guess wrong.
This is simply false; indeed, in this domain, they seem almost always to
guess wrong.  The second assumption is that no machine can be
programmed to "guess".  This is downright silly.  

The whole thing is silly. It's easy to write heuristic programs that
make pretty good guesses that are often wrong.  Penrose, Lucas, and
the rest of them appear to be assuming that a human mathematician has
some sort of "non-algorithmic" inside track that reveals to them
unprovable truths.  Then they use this assumption to "prove" that they
are different from machines -- which (therefore) can't think, or be
conscious, or intuitive, etc.
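One familiar example of such a heuristic guesser (my illustration, not one Penrose or Lucas discuss): the Fermat primality test, which guesses whether n is prime by checking whether 2**(n-1) is 1 mod n.  It is fast, usually right, and provably wrong on certain composites.

```python
def guesses_prime(n, base=2):
    """Guess that n is prime if base**(n-1) == 1 (mod n).

    A heuristic, not a proof: it is usually right, but it guesses
    "prime" for some composites, e.g. 561 = 3 * 11 * 17.
    """
    return n > 1 and pow(base, n - 1, n) == 1
```

So guesses_prime(97) correctly says True and guesses_prime(100) correctly says False, but guesses_prime(561) says True even though 561 is composite: a program that guesses well and sometimes guesses wrong, exactly like the mathematicians.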

It is silly, because there is no proof at all.  They simply assume
what they purport to prove. 
