From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!wupost!uunet!psinntp!scylla!daryl Thu Feb 20 15:21:28 EST 1992
Article 3804 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!wupost!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com
Subject: Re: Reference
Message-ID: <1992Feb17.180954.20426@oracorp.com>
Organization: ORA Corporation
Date: Mon, 17 Feb 1992 18:09:54 GMT

Michael Gemar writes:

> I am also not so sure that I would agree that it is simply up to human
> observers to decide whether a system is processing information.  If we
> are talking about Shannon and Weaver-type "information," sans semantic
> content (heck, without any content of any kind), then I think that
> "information processing" can be objectively defined independent of
> observers.  It's when semantics creeps in through the back door that I
> begin to complain.

I think the analogy with Shannon information theory is quite apt.
However, the same problems of interpretation arise for Shannon
information theory as arise for the question of whether a physical
system implements a particular finite state machine.

Shannon's information theory is about the transmission of elements of
a discrete alphabet over a noisy medium. For real physical systems,
signals are analog, not discrete; in order to apply Shannon's theory,
it is necessary to convert the analog signal into a discrete signal.

Now, if the analog signal was constructed especially for
communication, as is the case for radio signals or the ethernet, then
there is a more-or-less obvious way to discretize the signal. The
discretization (is that a real word?) process consists of identifying
certain equivalence classes of analog signals (just as in handwriting
we identify many different shapes with the letter "A").
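To make the equivalence-class idea concrete, here is a minimal sketch
(the voltage values and the threshold are made-up assumptions, not
anything from a real protocol): every analog sample above the
threshold is identified as the "same" symbol, just as many different
shapes are all read as the letter "A".

```python
def discretize(samples, threshold=2.5):
    """Map analog voltage samples to a two-symbol alphabet {0, 1}.

    The threshold defines the equivalence classes: every voltage
    above it counts as one symbol, every voltage below as the other.
    Choosing the classes is what turns an analog signal into a
    discrete one that Shannon's theory can talk about.
    """
    return [1 if v > threshold else 0 for v in samples]

analog = [0.3, 4.8, 4.1, 0.9, 3.7]   # hypothetical voltage readings
print(discretize(analog))             # -> [0, 1, 1, 0, 1]
```

For a man-made signal like ethernet, the threshold is fixed by the
design; the point of the argument is that for natural "signals" no
such privileged threshold exists.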

However, if we turn to "signals" that are not man-made---say the
shapes of clouds, or the timing of thunderclaps---we find that there
is no unique way to discretize these signals. We don't know which
signals are to be considered "variants" of the same signal (as the
different shapes are variants of the letter "A"), so we cannot even
identify what the alphabet is. Without identifying an alphabet, we
cannot apply Shannon's theory and get a unique answer as to what the
"bandwidth" of the communication channel is.

So, anyway, I disagree with your assertion. In an analog world (ours
is analog only if you ignore quantum mechanics), there is no unique
way to apply information theory to physical systems. I think the same
problem exists for asking what finite state machine a physical system
implements; depending on how picky you get about when two states are
considered different, you will get different answers.
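The same point about pickiness can be sketched in a few lines (the
trajectory values are hypothetical): lumping the analog states of a
system at different precisions yields different counts of "states",
and hence different finite state machines.

```python
trajectory = [0.10, 0.12, 0.55, 0.58, 0.90]   # made-up analog states

def distinct_states(traj, decimals):
    """Count states after lumping values that agree to the given precision."""
    return len({round(x, decimals) for x in traj})

# Coarse lumping sees a 2-state machine; fine lumping sees a 5-state one.
print(distinct_states(trajectory, 0))   # -> 2
print(distinct_states(trajectory, 2))   # -> 5
```

There is no fact of the matter, internal to the physics, about which
level of lumping is the right one.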

Daryl McCullough
ORA Corp.
Ithaca, NY
