From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!olivea!mintaka.lcs.mit.edu!yale!cs.yale.edu!mcdermott-drew Thu Feb 20 15:21:32 EST 1992
Article 3808 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:3808 sci.philosophy.tech:2153
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!olivea!mintaka.lcs.mit.edu!yale!cs.yale.edu!mcdermott-drew
From: mcdermott-drew@CS.YALE.EDU (Drew McDermott)
Newsgroups: comp.ai.philosophy,sci.philosophy.tech
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Keywords: consciousness,functionalism,meaning
Message-ID: <1992Feb17.205805.2012@cs.yale.edu>
Date: 17 Feb 92 20:58:05 GMT
References: <1992Feb12.063035.15857@organpipe.uug.arizona.edu> <1992Feb13.045721.29805@cs.yale.edu> <1992Feb13.201109.25439@psych.toronto.edu>
Sender: news@cs.yale.edu (Usenet News)
Organization: Yale University Computer Science Dept., New Haven, CT 06520-2158
Lines: 65
Nntp-Posting-Host: atlantis.ai.cs.yale.edu


  In article <1992Feb13.201109.25439@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
  >In article <1992Feb13.045721.29805@cs.yale.edu> mcdermott-drew@CS.YALE.EDU (Drew McDermott) writes:
  >>information
  >>processing is ... real, in spite of the attempts by people like
  >>... Gemar ...to argue that whether a system is an
  >>information processor is purely up to human observers (who confer
  >>derived intentionality on it).
  >
  >Wait just a minute here!  As the "Gemar" referred to in this quote,
  >I want to make sure that my position isn't misrepresented.  I am a 
  >*realist* with regards to intentionality.  I *know* that I possess it,
  >and I believe that at least other humans possess it too.  I believe that,
  >contrary to what is implied above, there is an actual *fact of the matter*
  >whether information processing systems have it as well.  I *don't* believe
  >that it is only a matter of "taking a stance."  Atoms are real, and so
  >are minds, "regardless of whether anyone is taking any stance toward them."

Sorry for being so sloppy.  I meant the kind of assertion that Searle
makes that the only reason to call my workstation an information
processor (instead of an expensive space heater) is that humans use it
to process information, and interpret its outputs as pieces of
information.  (I believe Searle makes this kind of claim in his
rebuttal to the commentaries on his latest BBS piece.)  

  >I am also not so sure that I would agree that it is simply up to human
  >observers to decide whether a system is processing information.  If
  >we are talking about Shannon and Weaver-type "information," sans
  >semantic content (heck, without any content of any kind), then I 
  >think that "information processing" can be objectively defined
  >independent of observers.  It's when semantics creeps in through
  >the back door that I begin to complain.

Whereas I think semantics is as objective as entropy, but I don't want
to go over the same arguments again.
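[Editorial illustration, not part of the original exchange: the
Shannon-and-Weaver sense of "information" that Gemar grants is
observer-independent can be made concrete.  Entropy is a function of
symbol frequencies alone, computed without any appeal to what the
symbols mean.  A minimal sketch in Python:]

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """H = -sum over symbols x of p(x) * log2(p(x)),
    where p(x) is the relative frequency of x in the message.
    No semantics involved: only the distribution of symbols."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A message of one repeated symbol carries no uncertainty;
# a uniform two-symbol message carries one bit per symbol.
print(shannon_entropy("aaaa"))  # 0.0
print(shannon_entropy("abab"))  # 1.0
```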

  >I am *happy* to assert that whether or not computers have minds like
  >us is a *fact*, and not merely a stance.  Indeed, it seems to me that it
  >is the *functionalists* who insist on stances and description-relative
  >truths when talking of minds.

Not when we're being careful.

Here's an idle observation: When the human race discovers a new kind
of thing, there is an initial tendency to view the new kind as being
in the mind of the beholders (that is, not really there, but
convenient for our minds to think in terms of).  After a generation or
two, when the new kind of thing is taken for granted, people just
classify things of that kind as real.  For example, in the
seventeenth century gravity was held to be an embarrassing theoretical
construct, which would eventually be supplanted by something that
could more obviously be said to really exist.  In the nineteenth
century atoms went through this stage.  Several quantum-mechanical
entities are still in it.  It seems clear to me that our current
tendency to see semantics as requiring a conscious mind as an anchor
is more of the same.  The competing idea that semantics is just
another natural phenomenon will seem unavoidable to our descendants.
Not only will they be surrounded by computers that maintain and use
representations of their environments, but they'll also be familiar
with natural systems like DNA molecules, which obviously display the
kernel features of semantics and "intentionality."

                                             -- Drew McDermott


