Newsgroups: comp.ai.philosophy
From: ohgs@chatham.demon.co.uk (Oliver Sparrow)
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!peernews.demon.co.uk!chatham.demon.co.uk!ohgs
Subject: Re: What makes up consciousness?
References: <departedD3vKy5.M3B@netcom.com> <792697161snz@chatham.demon.co.uk> <departedD3zp0E.Es9@netcom.com>
Organization: Royal Institute of International Affairs
Reply-To: ohgs@chatham.demon.co.uk
X-Newsreader: Demon Internet Simple News v1.27
Lines: 55
X-Posting-Host: chatham.demon.co.uk
Date: Wed, 15 Feb 1995 08:26:36 +0000
Message-ID: <792836796snz@chatham.demon.co.uk>
Sender: usenet@demon.co.uk

Nice one, Wesson Sir.

With reference to your "eliminate the homunculus" point, I think that you 
are completely correct. If there are hierarchies (rather than *a* hierarchy)
then - as you say - the problem goes away. We note the intersection of 
hierarchies everywhere: how a bunch of flowers got to be on a table can
employ any number of "explanations" - from biochemistry through social 
science to John-loves-Mary; and they are all of them true in the sense that 
they connect streams of ordering which change behaviour in predictable ways.
The further one gets from the intersection of the moment, the less related
the hierarchies may become, whilst retaining their internal integrity.

One thought that I advanced here once before - and where it did the lead
balloon trick, as does so much of what one says on Internet - is that of
levitating systems. Consider a functionality - some subset of taste or     
vision, let us say - and call it A. For A to be able to function, it calls on 
the product of an independent system, called B. The independence of B is 
guaranteed by two facts: that it does things other than serve A's needs and it 
accepts no input from A. It does, however, rely upon C, a third subsystem, 
which is independent of B but - oh lord, oh lord - draws upon the product of A.
Thus B is a part of the "explanation" of A, C is a contributory factor for
B, and C is made out of parts which include A. To capture what is going on,
one needs to go "up" a layer and see the A-B-C complex as a system, having 
emergent properties which are a thing-in-themselves, levitating in information 
space.
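The A-B-C loop can be caricatured in a few lines of code. This is purely an illustrative toy (the functions and coefficients are invented for the sketch, not drawn from any real subsystem): each part consumes the product of the next part around the cycle, so no one part "explains" the ensemble, yet the ensemble as a whole settles into a joint state that belongs to none of them singly.

```python
# Toy sketch of the A-B-C cycle: A draws on B's product, B on C's,
# and C - closing the loop - on A's. The coefficients are arbitrary
# illustrative choices, not a model of any actual functionality.
def step_A(b_out): return 0.5 * b_out + 1.0   # A calls on B's product
def step_B(c_out): return 0.5 * c_out + 1.0   # B relies upon C
def step_C(a_out): return 0.5 * a_out + 1.0   # C draws upon A's product

a = b = c = 0.0
for _ in range(50):
    # All three update simultaneously from each other's last outputs.
    a, b, c = step_A(b), step_B(c), step_C(a)

print(round(a, 3))  # the ensemble settles at a joint fixed point: 2.0
```

The point of the toy: the value 2.0 is a property of the A-B-C complex as a system, recoverable only by treating the loop "up" a layer, not from any single subsystem's rule in isolation.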

Finally: NNs interhooked and mutually learning do indeed have remarkable
properties. One trick is that they can invent principal components of a data 
field and train themselves to become PC detectors. There was a paper in 
Science last year which reported a set-up which had two TV cameras, each 
focused on a square grid and each feeding an independent NN. The standard 
deviation (I think) of the output layer signal of each NN was used as the 
training signal for the hidden layer of the other. When cold booted, the 
ensemble learned to detect either horizontal, vertical or gridded lines, 
falling towards one of the three attractors (sorry, Dr Rickert) in the system. 
Which emerged was unpredictable, but the ensemble thereafter was a reliable PC 
detector of other scenes in which one had a need to discover bars, grids or 
whatever.
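The self-training PC trick can be illustrated with Oja's rule: a single linear neuron whose Hebbian update, with a built-in decay term, converges on the first principal component of its input stream. To be clear, this is not the two-camera Science set-up described above, just a minimal one-neuron sketch of the general phenomenon, with invented data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data whose leading principal component lies along (1, 1):
# two channels sharing a common signal plus small independent noise.
base = rng.normal(size=(5000, 1))
data = np.hstack([base, base]) + 0.1 * rng.normal(size=(5000, 2))

# Oja's rule: w += eta * y * (x - y * w). The Hebbian term eta*y*x
# grows the weight along the data's dominant direction; the decay
# term -eta*y^2*w keeps the weight vector bounded.
w = rng.normal(size=2)
eta = 0.01
for x in data:
    y = w @ x                      # the neuron's output
    w += eta * y * (x - y * w)     # Oja's update

w /= np.linalg.norm(w)
print(w)  # direction ~ +/-(0.707, 0.707), the leading PC
```

The neuron was never told what to look for; the structure of its data environment alone pulled its weights onto the principal component, which is the flavour of self-organisation the Science result exhibited at ensemble scale.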

The point of this is that a bare-bummed system had learned to discover 
components of its data environment without a prompt other than being placed 
high on a Boltzmann surface. Brains are, presumably, thus and so; and the bars 
and grids which they learn to detect pass, as you rightly indicate, up the 
hierarchy to be the substrate on which higher order systems operate. One can 
see how it all stitches itself together in order to be able to spot Big Red 
Triangles Which Have Green Interiors, but not how it comes to discriminate 
between viable features and useful features of the environment. This last is 
where the cross-chain associations occur, and where you very rightly point out 
that we have a system of recursive self-referentiality, such that there is no 
core Thing to the tune of which everything dances.

_________________________________________________

  Oliver Sparrow
  ohgs@chatham.demon.co.uk
