From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!spool.mu.edu!think.com!samsung!uunet!kbsw1!chris Mon May 25 14:06:32 EDT 1992
Article 5776 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!spool.mu.edu!think.com!samsung!uunet!kbsw1!chris
>From: chris@kbsw1 (Chris Kostanick 806 1044)
Newsgroups: comp.ai.philosophy
Subject: Re: Grounding: Real vs. Virtual
Message-ID: <1992May20.170019.26095@kbsw1>
Date: 20 May 92 17:00:19 GMT
Article-I.D.: kbsw1.1992May20.170019.26095
References: <60703@aurs01.UUCP> <78417@netnews.upenn.edu>
Reply-To: chris@kbsw3.UUCP (Chris Kostanick 806 1044)
Organization: Kentek Information Systems
Lines: 12

The argument that a robot with transducers is fundamentally different
from a robot program interacting with a virtual world sounds wonky to me.
The transducer input is just a number (or a bunch of them, for a CCD
array), and the input from the virtual world is just a number. A big
part of developing military sensor systems (FLIR, cruise missiles, that
sort of thing) is building a rig to feed the software the kind of input
it expects; that's what lets you debug the software. The mil systems
seem to be adequately grounded, in that they hit the target a good bit
of the time.
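To make the point concrete, here is a minimal sketch (all names hypothetical, not from any actual mil system): the consuming software calls the same read_frame() interface whether the numbers come from a simulated world or from replayed "real" transducer data, so it cannot tell the difference.

```python
import random

class SimulatedCCD:
    """Virtual-world sensor: returns synthetic pixel values."""
    def __init__(self, width=4, height=4, seed=0):
        self.rng = random.Random(seed)
        self.width = width
        self.height = height

    def read_frame(self):
        # Just a bunch of numbers, like any CCD readout.
        return [self.rng.randint(0, 255)
                for _ in range(self.width * self.height)]

class ReplayCCD:
    """Stand-in for a real transducer: replays recorded output,
    the way a test harness feeds captured sensor data to guidance code."""
    def __init__(self, frames):
        self.frames = iter(frames)

    def read_frame(self):
        return next(self.frames)

def brightest_pixel(sensor):
    # The consuming software is identical either way: it only sees numbers.
    frame = sensor.read_frame()
    return max(frame)
```

Whether brightest_pixel() is "grounded" doesn't change depending on which class is behind the interface; that's the whole argument in miniature.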

Chris Kostanick
"Load sabot, SHOOT!"
