Newsgroups: comp.ai,comp.ai.philosophy,comp.ai.alife
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!portc01.blue.aol.com!news-peer.gsl.net!news.gsl.net!hunter.premier.net!news.nl.innet.net!INnl.net!feed1.news.erols.com!tezcat!news.bbnplanet.com!cam-news-hub1.bbnplanet.com!news.mathworks.com!newsfeed.internetmci.com!news.webspan.net!ix.netcom.com!netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: rand() - implementation ideas [Q]
Message-ID: <jqbE0sw9B.KM9@netcom.com>
Followup-To: comp.compression
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <54lr8o$ndm@nntp.seflin.lib.fl.us> <327e9a12.0@news.iea.net> <jqbE0Dvyr.A9H@netcom.com> <328958B2.4AC3@cu-online.com>
Date: Wed, 13 Nov 1996 08:43:58 GMT
Lines: 24
Sender: jqb@netcom23.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai:42130 comp.ai.philosophy:48696 comp.ai.alife:6896

In article <328958B2.4AC3@cu-online.com>,
Pablo H. Mayrgundter <pablo@cu-online.com> wrote:
>This isn't much to do with this group, but there's a bit of space
>around.  That will serve to justify a bit of rambling (or entice the
>sharks!!)
>
>Aren't there some considerations of s/n ratio when talking about
>compressing information (a la info theory)?  I grant that I'm far out
>of my depth here.  But I remember something about how the amount of
>information actually transmitted by a signal can be expressed as a
>probability.  That probability is related to the s/n ratio.  Would lossy
>compression, or guessing, increase the probability of correct
>information?  Or something like that?
>
>Furthermore, doesn't that highlight the difference between a grape?
>
>
>1qazs3edc5tgb7um
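[The relation the poster seems to be reaching for is the Shannon-Hartley
theorem, which bounds the information rate of a channel by its bandwidth
and signal-to-noise ratio.  A minimal sketch follows; the function names
and the telephone-channel numbers are illustrative, not from the post.]

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second.

    C = B * log2(1 + S/N), where S/N is the linear (not dB)
    signal-to-noise power ratio.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

def db_to_linear(snr_db):
    """Convert an S/N ratio given in decibels to a linear power ratio."""
    return 10 ** (snr_db / 10)

# Example: a 3 kHz voice channel with a 30 dB S/N ratio
# gives a capacity on the order of 30 kbit/s.
capacity = channel_capacity(3000, db_to_linear(30))
print(capacity)
```

[Note that capacity is a hard upper bound on *error-free* transmission;
lossy compression doesn't raise it, it trades away fidelity to fit the
signal within it.]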

Followup-To: comp.compression

-- 
<J Q B>

