From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl Mon Mar  9 18:36:01 EST 1992
Article 4337 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!sarah!cook!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: The Systems Reply I
Message-ID: <1992Mar6.185926.18497@oracorp.com>
Organization: ORA Corporation
Date: Fri, 6 Mar 1992 18:59:26 GMT

             The Systems Reply (as I understand it)

In brief summary, the Chinese Room thought experiment has a human
being locked inside a room executing by hand a program for
"understanding" Chinese. By following the instructions in the program,
the man produces the appearance of understanding Chinese, but he
doesn't really understand, since the rules are purely syntactic. The
Systems Reply is that (a) there is understanding in the Chinese Room,
and (b) it is not the human being but the *system* that understands.
Searle's response is to let the human memorize the rules, so that he
*is* the system, and he still doesn't understand Chinese.

From the point of view of computationalism, Searle's response is
completely off the mark. Searle assumes the following chain of
reasoning:

   One physical body => One system => One mind 
   => The human doesn't understand => The system doesn't understand.

The very first step, that one physical body implies one system, is
simply false, and illustrates that Searle doesn't really have any idea
what a "system" is (or at least he doesn't understand what is meant by
"system" in the Systems Reply).

So what is a system? According to computationalism, a system is
determined by a functional relationship among three things: (1)
information coming from the environment, (2) responses coming from the
system, and (3) state information, or memory, containing information
about the past. In other words, a system has inputs, outputs, memory,
and dynamics connecting them.
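
To make this concrete, here is a minimal sketch in Python (the names,
and the choice of a counting example, are mine; any formalism with
state, inputs, outputs, and dynamics connecting them would serve):

    # A system: inputs, outputs, memory, and dynamics connecting them.
    class System:
        def __init__(self, initial_state, dynamics):
            self.state = initial_state   # memory: information about the past
            self.dynamics = dynamics     # the functional relationship

        def step(self, input_from_environment):
            # The dynamics map (state, input) to (new state, output).
            self.state, output = self.dynamics(self.state,
                                               input_from_environment)
            return output

    # Illustrative dynamics: report how many inputs have arrived so far.
    def counting_dynamics(state, inp):
        return state + 1, state + 1

    s = System(0, counting_dynamics)
    s.step("ni")    # returns 1
    s.step("hao")   # returns 2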

Although systems require material bodies, they are not identified with
those bodies. For this reason, it is possible to change the matter
that makes up a person's brain, without changing the person's mind.
(Some people never change their minds 8^)

Without getting into too much detail about the definition of a system,
I would like to point out some consequences of the notion of a system
as an information processor:

   1. More than one system can exist in a single physical body.

A system is not determined simply by a physical body, but by a
physical body together with a choice of interface (what constitutes
the inputs and outputs). For every such choice, there is a different
system.
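
A toy illustration (a sketch only; the "body" and both interfaces are
invented for the example): one collection of internal variables, with
two different choices of what counts as input and output, gives two
distinct systems:

    # One "physical body": a single collection of internal variables.
    body = {"a": 0, "b": 1}

    # Interface choice 1: inputs drive variable "a"; outputs report "a".
    def system_one(inp):
        body["a"] += inp
        return body["a"]

    # Interface choice 2: inputs drive variable "b"; outputs report "b".
    def system_two(inp):
        body["b"] *= inp
        return body["b"]

    # Two input/output relationships, hence two systems, in one body.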

   2. One system can be implemented "on top of" another system.

That is, you can obtain one system from another system by choosing a
particular starting state, and by choosing a restricted interface.
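
In the same sketch style (all names mine): a general interpreter is
one system; fixing its starting state to hold a particular program,
and restricting the interface to that program's input/output, yields
a second system implemented on top of the first:

    # Base system: a general interpreter whose state is (program, data)
    # and whose dynamics defer each update to the stored program.
    def make_interpreter(program, data):
        state = {"program": program, "data": data}
        def step(inp):
            state["data"] = state["program"](state["data"], inp)
            return state["data"]
        return step

    # A particular starting state: this fixed program...
    def parity_program(data, inp):
        return (data + inp) % 2

    # ...gives a new, restricted system "on top of" the interpreter.
    layered = make_interpreter(parity_program, 0)
    layered(1)   # returns 1
    layered(1)   # returns 0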

   3. State information is not part of a system unless that information
      has a functional role in the future behavior of that system's
      interface.

This criterion settles which parts of a physical object are to be
considered part of the system.
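
For instance (again a toy sketch): state that is written but never
read back has no functional role at the interface, and so is not part
of the system, even though it sits in the same physical body:

    # "scratch" is recorded but never read, so it plays no functional
    # role in future interface behavior; by criterion 3 it is not part
    # of the system. A realization without "scratch" at all would be
    # the *same* system.
    def make_counter():
        state = {"count": 0, "scratch": []}
        def step(inp):
            state["scratch"].append(inp)   # inert: never affects output
            state["count"] += 1
            return state["count"]
        return step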

Now, all this can be made more precise, and has been; it is standard
stuff in the theory of processes. So far, nothing has been said
that isn't true by definition of what is meant by a system. However,
the one claim made by computationalism that isn't true by definition is
the following:

  Mental properties (including consciousness and understanding) are
  properties of systems.

In other words, a mind *is* a system. This is the core of the
disagreement between Searle and the computationalists; the question is
whether a system is the sort of thing that can be said to possess
mental properties, and I'm sure Searle would say no (and give his
syntax-does-not-equal-semantics argument). However, with this
understanding of the word "system", Searle's rebuttal to the Systems
Reply has no force whatsoever.

Daryl McCullough
ORA Corp.
Ithaca, NY


