From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!math.fu-berlin.de!news.netmbx.de!Germany.EU.net!mcsun!sunic2!seunet!kullmar!pkmab!ske Mon May 25 14:07:25 EDT 1992
Article 5868 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!math.fu-berlin.de!news.netmbx.de!Germany.EU.net!mcsun!sunic2!seunet!kullmar!pkmab!ske
From: ske@pkmab.se (Kristoffer Eriksson)
Newsgroups: comp.ai.philosophy
Subject: Re: Grounding: Real vs. Virtual (formerly "on meaning")
Keywords: symbol, analog, Turing Test, robotics
Message-ID: <6904@pkmab.se>
Date: 23 May 92 13:00:06 GMT
References: <1992May19.003821.9450@Princeton.EDU>
Organization: Peridot Konsult i Mellansverige AB, Oerebro, Sweden
Lines: 42

In article <1992May19.003821.9450@Princeton.EDU> harnad@shine.Princeton.EDU (Stevan Harnad) writes:
>A computer simulation of an analog object or state is not the same
>as that object/state despite the fact that it is computationally
>equivalent to it: A simulated plane does not really fly, a simulated
>furnace does not really burn, there is no real motion in a simulated
>solar system; by the same token, there is no real thinking in a 
>simulated nervous system. Computational equivalence is not the same
>as identity.

You are overlooking something. There is nothing about simulation that
says that if you simulate an instance of X, the simulation can under
no circumstances itself also satisfy the definition of being another
instance of X.

To take an obvious example, suppose we simulate the execution of another
computer program, and do this on a computer: then the simulation is
itself also the execution of a computer program.

Or, to take it a step further, suppose we simulate a program that
computes PI: then the simulation will also be a program that computes
PI.
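That point can be made concrete. Here is a minimal sketch (my own
illustration, not from the original post): a toy interpreter that
"simulates" a small program computing PI via the Leibniz series. The
program being simulated and the interpreter's names are hypothetical,
but the upshot is exactly the claim above: running the simulation is
itself an instance of computing PI.

```python
def simulate(program, rounds):
    """A toy interpreter: repeatedly apply each instruction of the
    simulated program to a shared state dictionary."""
    state = {"acc": 0.0, "k": 0}
    for _ in range(rounds):
        for instruction in program:
            instruction(state)
    return state

# The "simulated" program: accumulate the Leibniz series
#   pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# one term per round, then advance the term index.
leibniz_program = [
    lambda s: s.__setitem__("acc", s["acc"] + (-1) ** s["k"] / (2 * s["k"] + 1)),
    lambda s: s.__setitem__("k", s["k"] + 1),
]

state = simulate(leibniz_program, rounds=100000)
pi_estimate = 4 * state["acc"]
```

The interpreter never contains PI anywhere in its own code; yet by
faithfully simulating a PI-computing program, it ends up being one.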

Or suppose that I, in person, "simulate" behaving weird. What will I do?
Well, I will behave weird!

The point is: you cannot just proclaim that "there is no real thinking
in a simulated nervous system", especially not just "by the same token",
without first showing that "thinking" does not refer to, for instance, a
particular computation or behavior that the simulation will be performing
too. And, of course, the AI side's view on thinking is exactly that
thinking IS a computational process. You have thus not managed to elicit
any inconsistency in the AI view.

A further point, following from the assumption that thinking is a
computational process, is that it could also be implemented directly,
without going through a simulation of an actual human brain. For such
objects, we are NOT limited to simulation only; we can have the "real"
thing too.

-- 
Kristoffer Eriksson, Peridot Konsult AB, Hagagatan 6, S-703 40 Oerebro, Sweden
Phone: +46 19-13 03 60  !  e-mail: ske@pkmab.se
Fax:   +46 19-11 51 03  !  or ...!{uunet,mcsun}!mail.swip.net!kullmar!pkmab!ske