Newsgroups: sci.physics,alt.sci.physics.new-theories,alt.consciousness,sci.cognitive,comp.ai,sci.philosophy.tech,sci.skeptic
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!uunet!in1.uu.net!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: Review of Shadows of the Mind
Message-ID: <1995Mar31.020702.5633@oracorp.com>
Organization: Odyssey Research Associates, Inc.
Date: Fri, 31 Mar 1995 02:07:02 GMT
Lines: 143
Xref: glinda.oz.cs.cmu.edu sci.physics:115737 sci.cognitive:7103 comp.ai:28660 sci.philosophy.tech:17341 sci.skeptic:108578

Geez, are all these newsgroups necessary?

ghrosenb@phil.indiana.edu (Gregg Rosenberg) writes:

>> And, as Ken points out, it holds to Roger Penrose, as well. The "Godel
>> formula" for Penrose is:
>> 
>>     G(Penrose) =
>>     "This statement is not considered an unassailable truth by
>>      Roger Penrose."
>
>This is not true, really.

Yes, it is. Given a language L, and given a theory T in that language,
a Godel statement for T is a statement G such that G holds if and only
if G is not a theorem of T. If we let the language be English, and let
the theory T be the collection of English statements that Penrose
would consider unassailably true, then G(Penrose) above counts as a
Godel statement. G(Penrose) holds if and only if Penrose does not
consider it unassailably true.
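For the curious: the self-reference in G(Penrose) isn't magic; it is the same "quining" trick that makes self-printing programs possible. Here is a toy sketch in Python (the phrasing is purely illustrative, not anything from Penrose's book):

```python
def quinify(phrase):
    """Precede a phrase with its own quotation, yielding a
    sentence that describes itself (Quine's construction)."""
    return repr(phrase) + ' ' + phrase

# The diagonal trick applied to the example above:
g = quinify('is not considered an unassailable truth by '
            'Roger Penrose when preceded by its own quotation')

# g refers to itself: g is exactly the result of preceding
# the quoted phrase by its own quotation.
print(g)
```

No appeal to a formal system is needed to build such a sentence; ordinary English plus quotation suffices.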

>The Godel statement for whatever K might underlie Penrose's
>mathematical understanding is a perfectly ordinary statement of
>arithmetic.

Yes, on the assumption that Penrose' thinking processes are completely
described by formal system K, a Godel statement for Penrose can be
constructed which is a statement of pure arithmetic. Essentially, this
G would be the translation of the G(Penrose) above into arithmetic.

But it is unnecessary to assume that Penrose' beliefs are
computable. Regardless of whether they are computable, statement
G(Penrose) above can be used to show that Penrose cannot be consistent
and also be convinced of his consistency.
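To spell that out a bit (a sketch only, writing Bel(S) for "Penrose holds S unassailably", and assuming his unassailable beliefs are closed under modus ponens and simple introspection):

```latex
\begin{align*}
&(1)\ \mathrm{Bel}(G \leftrightarrow \lnot\mathrm{Bel}(G))
  && \text{he recognizes $G$ as a diagonal sentence}\\
&(2)\ \mathrm{Bel}(\mathrm{Con}(\mathrm{Bel}))
  && \text{suppose he is convinced of his own consistency}\\
&(3)\ \mathrm{Bel}(\mathrm{Con}(\mathrm{Bel}) \to \lnot\mathrm{Bel}(G))
  && \text{formalized first incompleteness argument}\\
&(4)\ \mathrm{Bel}(\lnot\mathrm{Bel}(G))
  && \text{from (2), (3) by modus ponens}\\
&(5)\ \mathrm{Bel}(G)
  && \text{from (1), (4)}\\
&(6)\ \mathrm{Bel}(\mathrm{Bel}(G))
  && \text{introspection on (5)}\\
&(7)\ \text{(4) and (6) are contradictory beliefs}
  && \text{so his beliefs are inconsistent}
\end{align*}
```

Nothing in this sketch requires the belief set Bel to be computable; only the closure properties are used.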

>For example, 'not-G(45)', meaning
>the no syntactic derivation exists, in the system, of the formula
>whose Godel numbering equals 45. It's got nothing to do with Penrose, 
>on the face of it. It is just an ordinary, run-of-the-mill, not tricky
>or self-referential at all, statement of number theory. It just so happens 
>that because we know the formula encoded by the '45', and we know what the
>predicate 'G(_)' is supposed to *mean*, we know this statement must be 
>true of any consistent system. That means we have an instance of a statement 
>of *number* theory, which we can see is true, but which cannot be proven 
>in the system.

Being able to code Penrose' beliefs into number theory is a red
herring. It isn't simply formal systems that are incapable of
"knowing" that they are consistent---*no* consistent system can know
it is consistent, whether it uses quantum theory, or intuition, or
psychic powers, or whatever.

>> Conclusion: If Roger Penrose is consistent, then the fact that he is
>> consistent is not one of his unassailable beliefs.

>> 
>> Penrose' mistake is in thinking that Godel's incompleteness theorem
>> only applies to computable theories. It applies to a much larger
>> collection of theories than that. Basically, no consistent theory
>> (whether or computable or not) can have as a consequence that it is
[typo in my original: should read "(whether computable or not)"]
>> itself consistent.

>I think that this is very interesting, and maybe even right. The problem
>is to make it stick by saying just what form this larger class of theories
>has, and how it does apply. That is difficult, because one wants to
>*derive* the result to prove it. But, from the premise that the class
>of theories is not formal, it follows that no such strict proof can
>be given. It will have to be informal, in the sense in which Godel's
>proof itself is informal. But no less convincing for that reason.

Any theory (a collection of statements closed under logical deduction),
whether computable or not, which is capable of (1) encoding its own
syntax and (2) expressing theoremhood, will be incomplete in the sense
of Godel's theorem: if the system is consistent, it will be impossible
for the system to prove that it is consistent (that is, the statement
coding the fact that it is consistent will not be a theorem). The
informal reason is that in any such theory it is possible to come up
with a statement G such that G holds if and only if G is not a
theorem. Here is one example of a noncomputable theory to which
Godel's incompleteness theorem applies:

       In the language of set theory, take as axioms ZFC + all
       true statements of arithmetic (interpreted as statements about
       finite ordinals). This theory is not computable, but it is
       possible to show that it is either inconsistent or incomplete,
       and it is possible to show that if it is consistent, it cannot
       prove its own consistency.
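Condition (1) above is cheap to satisfy: any injective coding of strings into numbers will do. A toy illustration in Python (this particular scheme is my own, chosen for brevity, not any canonical Godel numbering):

```python
BASE = 0x110001  # one more than the number of Unicode code points

def godel_number(formula):
    """Encode a formula (any string) as a single integer by
    treating each character as a nonzero digit in base BASE."""
    n = 0
    for ch in formula:
        n = n * BASE + ord(ch) + 1
    return n

def decode(n):
    """Recover the formula from its Godel number."""
    chars = []
    while n:
        n, d = divmod(n, BASE)
        chars.append(chr(d - 1))
    return ''.join(reversed(chars))

f = 'forall x. ~(S(x) = 0)'
assert decode(godel_number(f)) == f
```

Once formulas are numbers, properties of formulas (such as "is a theorem of T") become properties of numbers, which is exactly what lets a theory about numbers talk about its own syntax.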

Penrose claims to be just rejecting the idea that the brain is
computable, and *not* to be rejecting the idea that there can be a
mathematical theory of the operation of the brain. However, his
arguments, if sound, would imply that *no* mathematical theory can
describe the behavior of the brain, because any mathematical theory
can be used to give a mathematical definition of "unassailable
reasoning", which would make Godel's incompleteness theorem
applicable.

Actually, even without a mathematical theory of the brain, we can show
that the collection of English language statements considered
unassailably true by Penrose must be either inconsistent or 
incomplete---if it is a consistent collection, then the conclusion
"Our unassailable reasoning is consistent" *cannot* itself be an
unassailable truth.

>Anyway, I think the fact that Penrose goes through soundness to arrive
>at consistency makes it difficult to refute in this sense. The problem
>is, to back-up the claim that we do not know we are consistent, one must
>give an argument that we do not have an adequate grasp of mathematical
>truth. That kind of argument can send us on the slippery-slope to destroying
>mathematical knowledge, period, and from there, lots of other areas. If
>possible, I would prefer to keep knowledge. As I said earlier, I think
>a better strategy is to admit that we are consistent (in Penrose's
>sense), and that we know it. It does not follow that we know whatever 
>algorithm underlies our mind is consistent, since we do not know that
>algorithm (only its consequences under a semantic guise). The question
>is: were we to become aware of that algorithm, in what sense could we
>come to 'know' that it was *ours* or, even independently, that it was
>*sound*. That takes us into interesting areas. Several senses of 'knows'
>end up being at work in the argument, for instance, and it is not clear
>if any old sense will do for Penrose. Similarly, if AI takes the line
>that we could not, even in principle, know the soundness of the
>algorithm, interesting questions arise as to just what *kinds* of algorithms
>might meet that condition. It is not obvious that mere complexity would
>allow an algorithm to escape the net. Personally, I think this spells
>doom for any LOT-type program such as CYC. Such things might have practical
>uses, but can't be taken seriously as models of human mentality.
>
>>> The problem is that we *don't* know that we are in principle
>>> consistent. Actually, some of us *think* they know but they are
>>> wrong---those people who think they are consistent actually aren't.
>>> 
>>> Daryl McCullough
>>> ORA Corp.
>>> Ithaca, NY
>
>Thank-you very much for your post. Your point is well-taken. I'd like
>to see it made into something hard-and-fast that could really stick.
