From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!psinntp!scylla!daryl Mon Mar  9 18:34:14 EST 1992
Article 4169 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com
Subject: Re: Strong AI and panpsychism
Message-ID: <1992Feb29.143406.20009@oracorp.com>
Organization: ORA Corporation
Date: Sat, 29 Feb 1992 14:34:06 GMT
Lines: 25

bhw@aifh.ed.ac.uk (Barbara H. Webb) writes:

>>What I mean by implementation is pretty simple: A implements B if
>>there are functions mapping states of A to states of B and inputs and
>>outputs of A to inputs and outputs of B such that the mappings
>>preserve the transition relations.

>Surely this relationship of A and B is sufficient only for _simulation_.
>Implementation requires (at least) an _identity_ of inputs and outputs
>of the two systems, rather than a mere mapping between them.

You are right that requiring identity of inputs and outputs solves a
lot of the problems. However, it may be too strict a requirement.
Consider a person who has lost his or her hearing. He or she can
learn to communicate using sign language, which is a different (but
approximately isomorphic, at least if the person uses Signed English)
set of input and output signals. Does the person now implement a
different system?
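The mapping definition quoted above can be made concrete. Here is a
minimal sketch (the two machines and all names are hypothetical
examples, not from the original definition) that checks whether
mappings on states, inputs, and outputs preserve the transition
relation:

```python
# Sketch of "A implements B": there exist maps f_state, f_in, f_out such
# that whenever A steps (s, i) -> (s', o), machine B steps
# (f_state(s), f_in(i)) -> (f_state(s'), f_out(o)).
# The machines below are made-up two-state examples.

# Machine A: transition table (state, input) -> (next_state, output)
A = {
    ("a0", "x"): ("a1", "p"),
    ("a1", "x"): ("a0", "q"),
}

# Machine B: the same structure under different labels
B = {
    ("b0", "u"): ("b1", "r"),
    ("b1", "u"): ("b0", "s"),
}

f_state = {"a0": "b0", "a1": "b1"}
f_in = {"x": "u"}
f_out = {"p": "r", "q": "s"}

def implements(A, B, f_state, f_in, f_out):
    """Return True if the mappings carry every transition of A
    onto a transition of B (i.e., they preserve the relation)."""
    for (s, i), (s2, o) in A.items():
        if B.get((f_state[s], f_in[i])) != (f_state[s2], f_out[o]):
            return False
    return True

print(implements(A, B, f_state, f_in, f_out))  # True under these mappings
```

Note that on this definition any relabeling of inputs and outputs (as
with sign language replacing speech) still counts as implementing the
same system, which is exactly the point at issue.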

Daryl McCullough
ORA Corp.
Ithaca, NY
