From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!psinntp!scylla!daryl Thu Jan 16 17:20:09 EST 1992
Article 2686 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com
Subject: Re: Causes and Reasons
Message-ID: <1992Jan14.012301.19673@oracorp.com>
Organization: ORA Corporation
Date: Tue, 14 Jan 1992 01:23:01 GMT

Chris Malcolm writes:

> note that it is almost certainly inadequate simply to try to plug
> some sensors thru some signal->symbol translation box and pipe the
> output symbols into the creature's world model updating machinery.

I'm not exactly sure what you mean by "signal->symbol translation
box", but let me say what I think you mean. It is certainly not very
sensible to take the sensory input and translate it into English
statements (or are we talking about Chinese?). However, there is no
problem, conceptually, with representing signals with symbols. After
all, music from a CD is digital, and is thus representable as symbols,
even though it contains the information necessary to reproduce
auditory signals that are analogue. Given these additional inputs, one
can ask that the robot pass an appropriate Turing Test for this
enriched set of inputs (instead of words alone, as in the original
Turing Test). If this is what Steve Harnad means by his Total Turing
Test, then I have some sympathy for that position.

However, when Harnad was last posting to comp.ai (or when I was last
reading it), he had the conviction that it was crucial that these
sensory inputs be analogue, rather than digital. That never made sense
to me, since a digital signal with fine enough resolution is
indistinguishable from an analogue one (to human beings, whose senses
have limited resolution).
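The resolution point can be made concrete with a small sketch (my own
illustration, not anything Harnad proposed): CD-style 16-bit
quantization bounds the error in each sample to half a quantization
step, a fixed and arbitrarily shrinkable gap between the digital
signal and the analogue one it encodes.

```python
import math

def quantize(x, bits=16):
    """Round an analogue sample in [-1, 1) to the nearest of 2**bits levels."""
    step = 2.0 / (2 ** bits)  # spacing between adjacent digital levels
    return round(x / step) * step

# A burst of a 440 Hz tone sampled at the CD rate of 44100 Hz.
samples = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(100)]

# The quantization error never exceeds half a step: for 16 bits,
# about 1.5e-5 on a full-scale signal of amplitude 1.
step = 2.0 / (2 ** 16)
max_err = max(abs(s - quantize(s)) for s in samples)
```

Doubling the bit depth halves nothing perceptible; it just pushes
`max_err` further below whatever threshold the listener's ears impose.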

Daryl McCullough
ORA Corp.
Ithaca, NY
