Newsgroups: sci.physics,sci.skeptic,alt.consciousness,sci.psychology,comp.ai.philosophy,sci.bio,sci.philosophy.meta,rec.arts.books,rec.arts.sf.science
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!lard.ftp.com!bloom-beacon.mit.edu!galois!bondal
From: bondal@math.mit.edu (Alexei Bondal)
Subject: Re: Information And Entropy (models)
Message-ID: <1994Nov12.022928.24741@galois.mit.edu>
Sender: usenet@galois.mit.edu
Nntp-Posting-Host: kronecker
Organization: MIT Department of Mathematics
References: <39bnop$1d1@news-rocq.inria.fr> <39orvf$abm@news-rocq.inria.fr> <39r32g$fth@oahu.cs.ucla.edu> <KJM.179.000C0CD2@mfs1.ballarat.edu.au>
Distribution: inet
Date: Sat, 12 Nov 94 02:29:28 GMT
Lines: 32
Xref: glinda.oz.cs.cmu.edu sci.physics:99943 sci.skeptic:94980 sci.psychology:29519 comp.ai.philosophy:21949 sci.bio:23037 sci.philosophy.meta:14683

In article <KJM.179.000C0CD2@mfs1.ballarat.edu.au>,
Kevin Moore <KJM@mfs1.ballarat.edu.au> wrote:
>In article <39r32g$fth@oahu.cs.ucla.edu> colby@oahu.cs.ucla.edu (Kenneth Colby) writes:
>
>>    Before a scientific community, sharing a lexicon of kind-terms,
>>    can arrive at consensus on a theory, it must have consensibility,
>>    i.e. agreement on the meaning of terms and their concepts. We are
>>    currently burdened with two "entropies" leading to confusion. 
>
>Only two? Let's see now:
>
>There are the thermodynamic, Gibbs, Boltzmann, Shannon and Kolmogorov 
>entropies for a start, all different, although the Gibbs entropy is a 
>special case of the Shannon entropy and can be shown equivalent to the 
>thermodynamic entropy if the underlying model of the system is assumed 
>correct. I've also seen an entropy defined for continuous distributions, 
>although some authors dispute its validity, so that's six of them. 
>Anybody got any others?
>
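
A side note on the disputed continuous case: the usual objection is that
the "entropy" of a continuous distribution (differential entropy) can be
negative and is not invariant under a change of variables, so it behaves
rather unlike its discrete cousin. A minimal Python sketch, my own
illustration rather than anything from the thread:

import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, in nats; zero-probability terms drop out.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: log 2 ~ 0.693 nats

# The differential entropy of a uniform density on [0, a] is log a,
# which is negative for a < 1 -- one reason its status is disputed.
print(math.log(0.5))                 # -0.693...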

But of course there are more: the topological (Adler) entropy (a 
generalization of Shannon's noiseless channel capacity), the 
Connes-Pimsner-Popa entropy (for subfactors of von Neumann factors), 
and, in the same vein, Perron numbers, Liapunov exponents, etc.
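
To make the "growth rate" reading concrete: for a subshift of finite
type, the topological entropy equals the log of the Perron eigenvalue
(the spectral radius) of the 0-1 transition matrix, i.e. the exponential
growth rate of the number of admissible n-blocks. A hedged Python
sketch; the power-iteration helper is my own, not anything standard:

import math

def perron_eigenvalue(A, iters=1000):
    # Estimate the spectral radius of a nonnegative primitive matrix
    # by power iteration, normalizing by the largest component.
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [wi / lam for wi in w]
    return lam

# Golden-mean shift (binary strings with no two consecutive 1s):
A = [[1, 1],
     [1, 0]]
print(math.log(perron_eigenvalue(A)))    # ~0.4812
print(math.log((1 + math.sqrt(5)) / 2))  # log of the golden ratio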

What do they all have in common? One way or another, each measures an
exponential growth rate: of distinguishable orbits, of admissible words,
of uncertainty, and so on. Under suitable conditions some of them
coincide: the variational principle ties the topological entropy to the
measure-theoretic (Kolmogorov-Sinai) entropy, and Pesin's formula ties
the latter to Liapunov exponents.
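
One instance of such an equivalence: for the logistic map
f(x) = 4x(1-x), the Liapunov exponent, the Kolmogorov-Sinai entropy
(via Pesin's formula) and the topological entropy all equal log 2.
A quick numerical check, again my own sketch with arbitrary sample
sizes:

import math

def liapunov_exponent(x0=0.3, n=200000, burn=1000):
    # Average log|f'(x)| along an orbit of f(x) = 4x(1-x),
    # where f'(x) = 4 - 8x.
    x = x0
    for _ in range(burn):          # discard transients
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(4.0 - 8.0 * x))
        x = 4.0 * x * (1.0 - x)
    return total / n

print(liapunov_exponent())   # ~0.693
print(math.log(2))           # log 2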

Michael Abalovich
