Newsgroups: comp.ai.philosophy
From: Lupton@luptonpj.demon.co.uk (Peter Lupton)
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!demon!luptonpj.demon.co.uk!Lupton
Subject: Re: Information And Entropy (models)
References: <39thav$p26@netaxs.com> <383svn$js9@galaxy.ucr.edu> <1994Oct20.214734.15940@forte.com> <9411041725.PN18337@LL.MIT.EDU> <39n4at$2kg@seagoon.newcastle.edu.au>
Distribution: world
Organization: No Organisation
Reply-To: Lupton@luptonpj.demon.co.uk
X-Newsreader: Newswin Alpha 0.6
Lines:  72
Date: Sun, 13 Nov 1994 19:57:04 +0000
Message-ID: <853716820wnr@luptonpj.demon.co.uk>
Sender: usenet@demon.co.uk

 
> John Collier (pljdc@alinga.newcastle.edu.au) wrote:
> : in 1962. Brillouin points out that Shannon information cannot be
> : entropy, since it can decrease when passed through a passive filter.
> : Entropy, on the other hand, cannot decrease spontaneously, except
> : by the wildest chance. Shannon's definition was unfortunate..

My previous posting was rather long, so it was probably skipped
over. The burden of that posting was that Shannon's Information is
nothing but the mean Algorithmic Complexity of the data (where the
mean is taken over the distribution which is part of the definition
of an Information-Theoretic Source and/or Channel).
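
To make that concrete, here is a small sketch (Python, with zlib as
a crude, computable stand-in for the uncomputable Algorithmic
Complexity). The source and its parameters below are illustrative
assumptions only; for long sequences the compressed length per
symbol approaches the source entropy from above.

    # Mean compressed length as a computable stand-in for mean
    # Algorithmic Complexity, compared with the Shannon entropy of
    # the source. zlib only gives an upper bound on K(x).
    import math, random, zlib

    p = 0.1                    # probability that the source emits '1'
    n = 100000                 # sequence length
    entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    random.seed(0)
    seq = ''.join('1' if random.random() < p else '0' for _ in range(n))
    bits = 8 * len(zlib.compress(seq.encode(), 9))

    print("Shannon entropy:  %.3f bits/symbol" % entropy)
    print("zlib upper bound: %.3f bits/symbol" % (bits / n))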

If the Information of a signal is reduced (because the channel is
many-to-one) then we should expect physical entropy to be generated
as a result - the channel will warm up. But not by much, since most
information flows are *tiny* in comparison to physical entropy.
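
For a sense of scale, Landauer's bound (erasing one bit costs at
least kT ln 2 of dissipation) gives a back-of-envelope figure. The
bit rate below is an arbitrary assumption for illustration:

    # Minimum dissipation for discarding bits, via Landauer's bound.
    import math

    k = 1.380649e-23           # Boltzmann constant, J/K
    T = 300.0                  # room temperature, K
    per_bit = k * T * math.log(2)      # joules per erased bit

    rate = 1e9                 # suppose 10^9 bits/s are discarded
    print("Per bit: %.3e J" % per_bit)
    print("Power at %.0e bits/s: %.3e W" % (rate, per_bit * rate))

The answer comes out in picowatts, which is the sense in which
information flows are *tiny* next to ordinary physical entropy
production.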


In article: <39thav$p26@netaxs.com>  sparky@netaxs.com (Tim Sheridan) writes:

> The context of information may be the problem here..  if one erases a 
> chalk board there is still a lot of chalk and board so in some sense the 
> same amount of pure information..  but if you are only looking for 
> equations then you don't see any left on the board and would record a 
> loss of your particular kind of information.  But with physical reality 
> as the context one has a very different thing..

The use of the word 'Information' by Shannon(?) is very misleading.
Information as ordinarily understood involves the idea that one
state or process is about another state or process. This is what
makes Maxwell's Demon so interesting - here is a system in which the
Demon obtains information (in the everyday sense), and that
information does seem to be related to mean Algorithmic Complexity
(or information in Shannon's sense). Let me say, however, that I
wholly support Ken Colby's advice to concentrate on Algorithmic
Complexity. Algorithmic Complexity clearly generalises Shannon's
Information, since it can handle individual sequences of data and,
when applied to distributions, gives the same result (as a mean).
The transition from the mean to individual sequences of data is real
progress!
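
To see the point about individual sequences, the same crude zlib
stand-in assigns a complexity to a single string, with no
distribution anywhere in the picture:

    # A regular individual sequence compresses far below its length;
    # random bytes do not. zlib again upper-bounds the (uncomputable)
    # Algorithmic Complexity of each individual object.
    import os, zlib

    structured = b"ab" * 5000      # a tiny program generates this
    noise = os.urandom(10000)      # incompressible, almost surely

    for name, s in (("structured", structured), ("random", noise)):
        print("%-10s %6d bytes -> %6d bytes" %
              (name, len(s), len(zlib.compress(s, 9))))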

> Information and entropy are both created through physical process.
> I.e. an ice cube melting...  or a random number generator.. generating..
> 
> Though the two are different (i.e. meta vs physical) the process part 
> seems very related.
> 
> Now all this has gone very far from Consciousness.  But it is interesting 
> to attempt to grasp information for its universal and mysterious 
> nature..  If information is so related to entropy and "process" in the 
> physical world then it might be capable of acting as a "carrier of qualia"
> 
> "Qualia" made of information that represents it in conjunction with the 
> process of the encoder/decoder process that carries it?

There may well be some contribution that Algorithmic Complexity can
make to our understanding of consciousness. More directly, however,
Algorithmic Complexity is plainly related to our everyday notion of
information. We want to say that if we know something (if we have
information about something) then we have something which is
redundant with respect to that thing. Algorithmic Complexity
provides a notion of redundancy which operates in individual cases
(unlike Shannon's Information, which operates only over entire
distributions). It is the individual case we are usually concerned
with - knowing, for example, that if you turn left and left again
you end up at the school. I don't know what 'directions in the mean'
would amount to.
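
One crude way to exhibit that redundancy in an individual case,
again with zlib standing in for Algorithmic Complexity: if y carries
information about x, then compressing the pair together costs less
than compressing the two separately.

    # Redundancy in the individual case: C(x + y) << C(x) + C(y)
    # when y is informative about x (here y is simply a copy of x).
    import zlib

    def c(b):                      # compressed length in bytes
        return len(zlib.compress(b, 9))

    x = b"turn left, then left again: the school. " * 50
    y = x                          # maximally redundant with x

    print("C(x) + C(y) =", c(x) + c(y))
    print("C(x + y)    =", c(x + y))   # y adds almost nothing new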

Cheers,
Pete Lupton
