Newsgroups: comp.ai.fuzzy
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!gatech!howland.reston.ans.net!ix.netcom.com!netcomsv!uu3news.netcom.com!netcomsv!uucp3.netcom.com!zygot!dlb!megatest!djones
From: djones@Corp.Megatest.COM (Dave Jones)
Subject: Re: Q: How do we measure fuzziness?
Message-ID: <D5ro1D.7LE@Corp.Megatest.COM>
Organization: Megatest Corporation
References: <199503150537.AA15782@marlin.jcu.edu.au>
Date: Tue, 21 Mar 1995 01:27:09 GMT
Lines: 28

From article <199503150537.AA15782@marlin.jcu.edu.au>, by Michael.Smithson@jcu.edu.au:
>    Examples of the first kind (max fuzziness at 1/2) include:
> (a) Sum of the Hamming or Euclidean distance between the fuzzy set and the
> nearest crisp set (Kaufmann 1975),
> (b) Sum[-u*ln(u) - (1-u)*ln(1-u)] (Entropic measure due to De Luca & Termini
> 1972),
> (c) Sum[min(u,1-u)]/Sum[max(u,1-u)] (Entropic measure due to Kosko 1992).
> 
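For concreteness, here is how I read the three quoted measures, sketched in
Python (my own illustration; the function names are mine, and each takes a
fuzzy set as a plain list of membership grades u in [0, 1]):

```python
import math

def hamming_fuzziness(U):
    # (a) Hamming distance to the nearest crisp set: each grade u is
    # rounded to the nearer of 0 and 1, and the distances |u - round(u)|
    # = min(u, 1-u) are summed.
    return sum(min(u, 1.0 - u) for u in U)

def entropy_fuzziness(U):
    # (b) De Luca & Termini: Sum[-u*ln(u) - (1-u)*ln(1-u)], with the
    # usual convention that 0*ln(0) = 0.
    def h(u):
        if u == 0.0 or u == 1.0:
            return 0.0
        return -u * math.log(u) - (1.0 - u) * math.log(1.0 - u)
    return sum(h(u) for u in U)

def kosko_fuzziness(U):
    # (c) Kosko: Sum[min(u, 1-u)] / Sum[max(u, 1-u)].
    return sum(min(u, 1.0 - u) for u in U) / sum(max(u, 1.0 - u) for u in U)

crisp = [0.0, 1.0, 1.0, 0.0]    # a crisp set: all three measures give 0
fuzzy = [0.5, 0.5, 0.5, 0.5]    # grades all 1/2: each measure is maximal

for f in (hamming_fuzziness, entropy_fuzziness, kosko_fuzziness):
    print(f.__name__, f(crisp), f(fuzzy))
```

All three vanish on crisp sets and peak when every grade equals 1/2, which
is the "max fuzziness at 1/2" property mentioned above.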
 
The addend of the second one is related to Shannon entropy, and is used
in a slightly rewritten form by J.L. Kelly Jr. in "A New Interpretation
of Information Rate", _The Bell System Technical Journal_, July 1956,
page 920.

After adding 1, he calls the resulting value "the rate of transmission as
defined by Shannon", and credits C.E. Shannon and R.E. Graham with assisting
in the preparation of the paper.
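In case it helps others reading along, the connection can be checked
numerically. For an even-money bet won with probability p, the Kelly-optimal
wager is the fraction f* = 2p - 1 of bankroll, and the exponential growth
rate of capital works out to 1 + p*log2(p) + (1-p)*log2(1-p), i.e. 1 minus
the Shannon entropy of the outcome (the "addition of 1" above). A small
sketch (my own, not code from Kelly's paper):

```python
import math

def growth_rate(p):
    # Expected log2 bankroll growth per even-money bet, wagering the
    # Kelly fraction f* = 2p - 1 of current capital.
    q = 1.0 - p
    f = 2.0 * p - 1.0
    return p * math.log2(1.0 + f) + q * math.log2(1.0 - f)

def one_minus_entropy(p):
    # 1 + p*log2(p) + q*log2(q): one minus the binary Shannon entropy.
    q = 1.0 - p
    return 1.0 + p * math.log2(p) + q * math.log2(q)

p = 0.6
print(growth_rate(p), one_minus_entropy(p))   # the two agree
```

The algebra is one line: p*log2(2p) + q*log2(2q) = 1 + p*log2(p) + q*log2(q).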

I have a good intuitive grasp of how Shannon entropy measures average 
confusion, and how it relates to information transmission rate (as
per the formula above), and also how it relates to bankroll growth
in gambling (as per "the Kelly criterion"). Could you elaborate just a bit
on the other two formulas above? I can't seem to form any intuitive idea
of what they measure.


           Thanks,
           Dave
