Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!news.bluesky.net!solaris.cc.vt.edu!news.mathworks.com!uhog.mit.edu!bloom-beacon.mit.edu!eru.mt.luth.se!news.luth.se!sunic!liuida!c89ponga
From: c89ponga@ida.liu.se (kand. Pontus Gagge)
Subject: Re: Dennett versus Searle
Message-ID: <1995Feb10.192629.642@ida.liu.se>
Sender: news@ida.liu.se
Organization: CIS Dept, Univ of Linkoping, Sweden
References: <3h8p6e$40t@sunserver.lrz-muenchen.de>
Date: Fri, 10 Feb 1995 19:26:29 GMT
Lines: 48

ua352af@sun2.lrz-muenchen.de (Michael Pietroforte) writes:

[Omitted]

>On the other hand, he concedes (p. 225) that a >von Neumann
>machine, by being WIRED UP from the outset that way< (my emphasis)
>is entirely unconscious, whereas a virtual von Neumannesque
>machine implemented in the parallel architecture of the brain is
>sufficient for consciousness (p. 210).
[Omitted]
>Is there really a contradiction or is my reading of Dennett
>incorrect?

He is, perhaps, somewhat unclear in the cited passage(s). What he
probably means is that *a* von Neumann machine (say, one that 
calculates the sum of two numbers) is certainly unconscious, even
though it performs operations which *for us* are at a high level
and involve whatever faculties we like to call "conscious".
However, this is not inconsistent with the idea that *some*
sufficiently complex TM could simulate (and thus possess)
consciousness; the TM description, though, would probably
not be very illuminating: all the interesting stuff (such as
a virtual "Joycean machine") would happen at higher levels of 
abstraction, not at the lowest level of primitive machine 
instructions. (Yes, talking about "levels" and "higher/lower"
is vague, but it's the best we/I can do right now.)
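To make the "levels" point a bit more concrete, here is a toy sketch
of my own (not anything from Dennett or Pietroforte): a trivial
"virtual machine" interpreting primitive instructions. At the lowest
level there is only opcode dispatch; the interesting description of
what is going on ("double a number") exists only at the higher level
of the interpreted program.

```python
# Toy illustration (my own, not from the cited text): a minimal
# interpreter for primitive instructions. The low-level view is just
# an opcode-dispatch loop; the higher-level behaviour is a property
# of the program being interpreted, not of the loop itself.

def run(program, acc=0):
    """Execute a list of (opcode, argument) pairs on an accumulator."""
    for op, arg in program:      # primitive instruction cycle
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        else:
            raise ValueError("unknown opcode: %s" % op)
    return acc

# Higher-level description: "double 21" -- invisible at the opcode level.
double_21 = [("LOAD", 21), ("MUL", 2)]
print(run(double_21))  # 42
```

Staring at the `run` loop tells you nothing about doubling; that
description only applies at the level of `double_21`. The analogy
(on my reading) is that staring at a TM's transition table would tell
you equally little about any virtual "Joycean machine" it implements.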

He further inclines towards the (not unreasonable) view that, for
*practical* purposes, true AI cannot come about without massive
parallelism, which perhaps muddles the issue further. The
practical necessities do not contradict the claim that such a
machine could be simulated by a linear process such as a TM.
The claim is only of mathematical/philosophical interest, not
at all practical.
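The simulation claim itself is easy to illustrate. Here is a minimal
sketch of my own devising (not anything from the post): one "parallel"
update step, in which all units are supposed to change simultaneously,
carried out by a strictly serial loop. Double-buffering (reading only
the old state, writing a fresh one) makes the serial order irrelevant,
which is all the in-principle claim needs; the *practical* objection
is only that this gets slow for realistically large machines.

```python
# Minimal sketch (my own illustration): a serial process simulating
# one simultaneous "parallel" update. Every unit becomes the sum of
# its two neighbours, all "at once"; the serial loop fakes this by
# reading only the OLD state, so iteration order cannot matter.

def parallel_step(state):
    """One synchronous update of a ring of units, done serially."""
    n = len(state)
    new_state = [0] * n
    for i in range(n):               # serial loop standing in for parallelism
        left = state[(i - 1) % n]    # reads touch only the old state
        right = state[(i + 1) % n]
        new_state[i] = left + right  # writes go to the new buffer
    return new_state

print(parallel_step([1, 0, 0, 1]))  # [1, 1, 1, 1]
```

Run the loop backwards, or in any scrambled order, and the result is
identical, which is the sense in which the linear process faithfully
simulates the parallel one.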

Thus: the abstraction levels of the brain's "Joycean machine" and
the mathematical description of computers as Turing machines 
are not really commensurable; this is the trap of the Chinese
Room Gedankenexperiment.

>-------------
>Michael Pietroforte
>ua352af@sunmail.lrz-muenchen.de

--
/--- Ego sum --------\ /------------------------\
! kand. Pontus Gagge  ! c89ponga.und.ida.liu.se !
\---- Enjoyment is an overrated pleasure. ------/
