From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!ames!elroy.jpl.nasa.gov!swrinde!mips!think.com!snorkelwacker.mit.edu!news.media.mit.edu!minsky Tue May 12 15:49:57 EDT 1992
Article 5510 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!ames!elroy.jpl.nasa.gov!swrinde!mips!think.com!snorkelwacker.mit.edu!news.media.mit.edu!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Newsgroups: comp.ai.philosophy
Subject: Re: penrose
Message-ID: <1992May8.015202.10792@news.media.mit.edu>
Date: 8 May 92 01:52:02 GMT
References: <2524@ucl-cs.uucp> <1992May1.025230.8835@news.media.mit.edu> <1992May6.220605.26774@unixg.ubc.ca>
Sender: news@news.media.mit.edu (USENET News System)
Organization: MIT Media Laboratory
Lines: 98
Cc: minsky

In article <1992May6.220605.26774@unixg.ubc.ca> ramsay@unixg.ubc.ca (Keith Ramsay) writes:
>In article <1992May1.025230.8835@news.media.mit.edu>
>minsky@media.mit.edu (Marvin Minsky) writes:
>> The math seems generally OK, but the stuff on universal Turing machines seems
>>amateurish.  He either did not know, or neglected to point out that
>>there are known to be very small Universal Turing Machines (e.g, 4
>>symbols, 7 states).  
>
>Is there some special significance to this fact (so that one would
>make a special point of including it)?

Yes indeed, because Penrose's book is permeated by insinuations to make
the naive reader feel that people are not machines -- and every single
one of these insinuations is based on some sort of wrongness.  I am
angry because *anyone* could collect a lot of defective arguments and
then insinuate that "where there's smoke, there's fire".
Specifically, Penrose presents a several-hundred-digit Godel-like
number for his universal Turing machine and then says:
  "This number no doubt seems alarmingly large!  Indeed it *is*
   alarmingly large but I have not been able to see how it could be
   made significantly smaller. ... One is inevitably led to a number
   of this size for the coding of an actual universal Turing machine."

Well, most of Penrose's proofs have this form.  The special significance
is that he has made you think that UTMs are so complex or something
that a brain must be something else.  If he did not have some such
purpose, then why did Penrose make such a special point of including
it?
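
[For concreteness, a sketch in Python -- not from Penrose's book or
Minsky's post.  The machine below is the ordinary 2-state, 2-symbol
"busy beaver", not one of the tiny *universal* machines mentioned
above; the point is only that the whole apparatus of states, symbols,
and transition tables fits in a dozen lines, nothing like what a
several-hundred-digit encoding would suggest.]

```python
# A complete Turing-machine interpreter in a dozen lines of Python.
# (Illustrative sketch only: this machine is the 2-state, 2-symbol
# "busy beaver", NOT a universal machine; it shows how small the
# machinery of "states plus transition table" really is.)

def run_tm(rules, tape, state="A", pos=0, blank="0", max_steps=10_000):
    """Run a machine given as {(state, symbol): (write, move, next_state)}."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "HALT":
            break
        write, move, state = rules[(state, cells.get(pos, blank))]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Two states, two symbols: writes four 1s on a blank tape and halts.
bb2 = {
    ("A", "0"): ("1", "R", "B"),
    ("A", "1"): ("1", "L", "B"),
    ("B", "0"): ("1", "L", "A"),
    ("B", "1"): ("1", "R", "HALT"),
}
print(run_tm(bb2, "0"))  # -> 1111
```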

>
>>  So far as I can see, Penrose's discussion about Godel's theorem
>>depends on making peculiar assumptions about (1) that humans have
>>magical abilities to recognize mathematical truths
>
>I think it is a little more subtle than that. Penrose (incorrectly,
>IMO) concludes the non-computable abilities from our ability to apply
>Godel's reasoning to arbitrary formal systems. It is more a mistake
>than an untoward assumption.
>
>>and (2) that the
>>Turing machines aren't allowed to generate new Turing machines that
>>use different sets of axioms.
>
>I'm not sure I see the relevance of this. One can emulate a
>deterministically evolving "community" of Turing machines using just
>one Turing machine. The theorems of the community are then just
>theorems generated by one machine, but tagged by "source". How does
>this help?

Are you saying that "a mistake" is better or worse than "an untoward
assumption"? I'm complaining that *everything* in "The Emperor's New
Book" is one or the other when it comes to its main thesis that the
brain/mind is non-algorithmic.  But I guess I wasn't very clear here.
I should have emphasized that Penrose simply failed to realize that
there could be TM's that compute the consequences of *inconsistent*
sets of axioms.  This is a dreadful oversight because that is in fact
(I assert) precisely what brains do.  Now the whole discussion is
preparation for talking about how machines will be "balked", should I
say, by Godel's theorem, whereas people won't. 
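
[A sketch of that point in Python, purely illustrative -- the facts
and rules here are made up, and this is not a real theorem prover.
A machine that mechanically closes an axiom set under its rules is
completely indifferent to whether the axioms are consistent; it just
keeps producing consequences.]

```python
# A tiny forward-chainer: derive everything that follows from a set of
# atomic facts and rules of the form (premises, conclusion).  Feeding
# it an *inconsistent* axiom set causes no breakdown at all -- the
# contradiction is just one more derivable fact.

def consequences(facts, rules):
    """Close `facts` under `rules`, returning the full derived set."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

# An inconsistent set: both P and not-P are axioms.
facts = {"P", "not-P"}
rules = [({"P"}, "Q"), ({"P", "not-P"}, "absurd")]
print(sorted(consequences(facts, rules)))
# -> ['P', 'Q', 'absurd', 'not-P']
```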
  
Now here's one of the things wrong with what follows.  Penrose
presents a proof of Godel's theorem, shows how to make a Godel
sentence (with no proof in the system) and then uses "Insight" to
argue that the sentence must be True.  Then, of course, we can append
this as an axiom of a larger system, etc., ad infinitum.  Then follows a
song-and-dance about how these insights cannot be "systematized" by
any algorithm.  I cannot follow this argument, which is based on a new
term called "meaning".  Penrose cautions that these "reflection
arguments" are dangerous, and that "one must always be careful when
using them" lest one get into some sort of Russell-type paradox.

  Wow, gee.  Now "being careful" becomes part of the thesis!  Well,
W. V. Quine was awfully careful when he formulated "New Foundations".
And Barkley Rosser found the mistake.  The system was in fact
inconsistent.  The evidence is that Penrose is not nearly so careful
as Quine, in my view.  He is undoubtedly using a "system of
reasoning" that is inconsistent!  And so are we all!  So here's my
interpretation of Godel's theorem.

  Any formal system that is (1) powerful enough to express Arithmetic
and (2) consistent will be too weak to support the familiar
forms of human commonsense reasoning -- particularly in regard to
"reflexive" thought in which a person thinks about his/her own
processes.  And this is indeed a pity.  Because it means that in order
to make machines that appear to be reasonably smart, one must use the
(apparently) necessary short cuts (or heuristics) of self-reference.
There is no problem at all about making machines do such things;
however, when we interpret ("meaning, again") what they are doing in
the appropriate way it will turn out, of course, that they are using
an inconsistent logic.  (Any kid can be led into asserting Russell's
paradox.)  All of Penrose's song and dance is based on overlooking this
possibility! You can call it a wrong assumption or a mistake; I call
it a dogmatic act of faith; he (and Lucas and all the rest) are so
invested in believing that they're more than physical that they can't
see the structure of their motivations.
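
[The Russell's-paradox remark can be made concrete in any language
with first-class functions; this is an illustrative sketch, nothing
from Penrose or the formal systems under discussion.]

```python
import sys

# Encode the predicate "s does not apply to itself" as a function,
# then apply it to itself.  Evaluation chases its own tail until
# Python gives up -- the computational shadow of Russell's paradox,
# where the sentence has no consistent truth value.

def russell(s):
    return not s(s)

sys.setrecursionlimit(100)   # fail fast instead of after thousands of frames
try:
    russell(russell)         # russell(russell) == not russell(russell) == ...
except RecursionError:
    print("no consistent truth value: evaluation never bottoms out")
```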

Well, I've been meaning to say this since college days, but I never
got quite stirred up enough.  Thanks!


