From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!psinntp!psinntp!scylla!daryl Tue Feb 11 15:25:25 EST 1992
Article 3553 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com
Subject: Re: Multiple Personality Disorder and Strong AI
Message-ID: <1992Feb6.132111.9087@oracorp.com>
Organization: ORA Corporation
Date: Thu, 6 Feb 1992 13:21:11 GMT

Michael Gemar writes: (in response to Drew McDermott)

>> To get back to the puzzle: Consciousness is not a mass phenomenon.  If
>> the whole network maintains a model of itself as conscious, it is
>> conscious.

> Um...so what's lookin' at the model?

I took Drew's statement as meaning the network.

> And how does it *know* it's got one?

Do you mean "How does the network know that the model is a model of
itself?" It doesn't figure it out, any more than you figure out that
you are conscious; it is constructed so that it knows certain
primitive facts.

> Honestly, when I see explanations like the above, I want to jump up
> and down and yell "The Emperor has no clothes!"  Recursion and
> self-reflection are *not* explanations, Hofstadter to the contrary.

Hofstadter didn't invent the idea that self-reflection is an important
part of consciousness; it is quite a common belief that the lack of
self-reflection prevents computers (and perhaps some animals) from
being conscious. I take it that you don't think it is important?

Anyway, Hofstadter was simply showing that, contrary to what you may
think at first, it *is* possible, using recursion, to get
self-reflection in a computer program. It may seem that to get
self-reflection, a computer program would need a copy of itself
(which would itself require a copy of a copy, and so on ad infinitum).
Recursion allows a way out of the infinite regress; the seemingly
infinite levels of copies within copies, etc. can be encoded in a
finite algorithm.
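To make the point concrete, here is a toy sketch (not anything from
Hofstadter, just an illustration of the finite-encoding idea): a
function that can unfold a "model of a system modeling itself" to any
requested depth, even though the program contains no nested copies of
itself at all. The function name and output strings are made up for
this example.

```python
def self_model(depth):
    """Unfold the self-model to the given depth.

    The infinite tower of "a model of a model of ..." is not stored
    anywhere; it is generated on demand by one finite recursive rule.
    """
    if depth == 0:
        return "system"
    return "system modeling (" + self_model(depth - 1) + ")"

# Any finite depth of self-reflection is available from the same
# few lines of code:
print(self_model(1))  # system modeling (system)
print(self_model(3))  # system modeling (system modeling (system modeling (system)))
```

The regress is avoided because the recursion bottoms out: each level is
produced by re-invoking the same finite rule, not by consulting a
stored copy.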

Daryl McCullough
ORA Corp.
Ithaca, NY
