From newshub.ccs.yorku.ca!torn!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!princeton!phoenix.Princeton.EDU!harnad Fri Sep  4 09:41:03 EDT 1992
Article 6700 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:6700 comp.ai:4069
Newsgroups: comp.ai.philosophy,comp.ai
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!princeton!phoenix.Princeton.EDU!harnad
From: harnad@phoenix.Princeton.EDU (Stevan Harnad)
Subject: Virtual Symposium on Virtual Mind
Message-ID: <1992Aug25.213821.4522@Princeton.EDU>
Followup-To: sender
Summary: available by anonymous ftp
Originator: news@nimaster
Keywords: Searle, Chinese Room, Virtual Reality, Mind Modelling
Sender: news@Princeton.EDU (USENET News System)
Nntp-Posting-Host: phoenix.princeton.edu
Organization: Princeton University
Date: Tue, 25 Aug 1992 21:38:21 GMT
Lines: 64

The following article is retrievable by anonymous ftp on host princeton.edu
directory pub/harnad filename harnad92.virtualmind
                ----
Hayes, P., Harnad, S., Perlis, D. & Block, N. (1992) Virtual Symposium
on Virtual Mind. Minds and Machines 2(3) 217-238.
                ----

		VIRTUAL SYMPOSIUM ON VIRTUAL MIND

		Patrick Hayes
                CSLI
		Stanford University

		Stevan Harnad
		Psychology Department
		Princeton University
		
		Donald Perlis
		Department of Computer Science
		University of Maryland
		
		Ned Block
		Department of Linguistics and Philosophy
		Massachusetts Institute of Technology


KEYWORDS: Chinese Room Argument; Searle; Turing Test; computationalism;
functionalism; hermeneutics; implementation; mind; other-minds problem;
robotics; semantics; symbol grounding; virtual reality.

ABSTRACT:

When certain formal symbol systems (e.g., computer programs)
are implemented as dynamic physical symbol systems (e.g., when they are
run on a computer) their activity can be interpreted at higher levels
(e.g., binary code can be interpreted as LISP, LISP code can be
interpreted as English, and English can be interpreted as a meaningful
conversation). These higher levels of interpretability are called
"virtual" systems. If such a virtual system is interpretable as if it
had a mind, is such a "virtual mind" real?

This is the question addressed in this "virtual" symposium, originally
conducted electronically among four cognitive scientists: Donald
Perlis, a computer scientist, argues that according to the
computationalist thesis, virtual minds are real and hence Searle's
Chinese Room Argument fails, because if Searle memorized and executed a
program that could pass the Turing Test in Chinese he would have a
second, virtual, Chinese-understanding mind of which he was unaware (as
in multiple personality). Stevan Harnad, a psychologist, argues that
Searle's Argument is valid, virtual minds are just hermeneutic
overinterpretations, and symbols must be grounded in the real world of
objects, not just the virtual world of interpretations. Computer
scientist Patrick Hayes argues that Searle's Argument fails, but
because Searle does not really implement the program: A real
implementation must not be homuncular but mindless and mechanical, like a
computer. Only then can it give rise to a mind at the virtual level.
Philosopher Ned Block suggests that there is no reason a mindful
implementation would not be a real one.


-- 
Stevan Harnad  Department of Psychology  Princeton University
harnad@clarity.princeton.edu / harnad@pucc.bitnet / srh@flash.bellcore.com 
harnad@learning.siemens.com / harnad@elbereth.rutgers.edu / (609)-921-7771


