From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!uwm.edu!psuvax1!rutgers!micro-heart-of-gold.mit.edu!news.media.mit.edu!minsky Tue Feb 11 15:24:51 EST 1992
Article 3516 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!uwm.edu!psuvax1!rutgers!micro-heart-of-gold.mit.edu!news.media.mit.edu!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Newsgroups: comp.ai.philosophy
Subject: Re: Strong AI and panpsychism
Message-ID: <1992Feb4.151115.5600@news.media.mit.edu>
Date: 4 Feb 92 15:11:15 GMT
References: <1992Jan31.193524.28969@psych.toronto.edu> <1992Jan31.233453.7625@news.media.mit.edu> <1992Feb3.113723.2519@arizona.edu>
Sender: news@news.media.mit.edu (USENET News System)
Distribution: world,local
Organization: MIT Media Laboratory
Lines: 63
Cc: minsky

In article <1992Feb3.113723.2519@arizona.edu> bill@NSMA.AriZonA.EdU (Bill Skaggs) writes:
>In article <1992Jan31.233453.7625@news.media.mit.edu> 
>minsky@media.mit.edu (Marvin Minsky) writes:
>>
>>What's more, I don't even see why those formal systems even need to be
>>run on real computers, if they are specified complete with their
>>environments.  Those virtual beings, just as "conscious" as me and
>>(presumably) you, can lead arbitrarily rich, imaginative lives, or
>>whatever.
>>
>  Well, here's the problem, as I see it:  
>
>  Consider an arbitrary rock, and an arbitrary finite state
>automaton.  There exists a mapping from vibrational states
>of the rock to states of the FSA which preserves the state
>transition function of the FSA.  (The mapping is probably
>time-dependent, but so what?)  Under this mapping, the rock
>is performing the same computation as the FSA.
>
>  Therefore, if an FSA can be conscious, and consciousness is
>merely a matter of performing the right sort of computation,
>then a rock can be conscious.
>
>  What's wrong with this reasoning?
>
>	-- Bill
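
Bill's mapping can be made concrete in a few lines -- a minimal sketch, with made-up "rock" and FSA states, not anything from either post: given any deterministic sequence of distinct physical states, a step-by-step (time-dependent) relabeling turns that sequence into a run of a given FSA.

```python
# Toy illustration of the rock/FSA mapping argument.  All states and
# the transition table are hypothetical.

def fsa_run(transition, start, n_steps):
    """Run an input-free FSA for n_steps; return the state sequence."""
    states = [start]
    for _ in range(n_steps):
        states.append(transition[states[-1]])
    return states

# A two-state FSA: A -> B -> A -> B ...
transition = {"A": "B", "B": "A"}
fsa_states = fsa_run(transition, "A", 5)

# The "rock": an arbitrary sequence of distinct vibrational states.
rock_states = [f"r{t}" for t in range(6)]

# Time-dependent mapping: at time t, map the rock's state at t to the
# FSA's state at t.  Because the rock's states are distinct at each
# step, such a mapping always exists.
mapping = {(t, r): s for t, (r, s) in enumerate(zip(rock_states, fsa_states))}

# Under this mapping the rock's trajectory reproduces the FSA run.
mapped = [mapping[(t, r)] for t, r in enumerate(rock_states)]
assert mapped == fsa_states
```

The triviality is visible in the construction: the mapping does all the work, carrying no constraint from the rock's physics, which is exactly why the argument invites the reply below.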

Nothing.  So long as it performs the _right sort_ of computation.  But
there's no reason to think that rocks can do this. Brains, in good
health, can -- but this is a result of 3 billion years of evolution.
Rocks don't evolve, because they lack hereditary structural codes,
etc.

Now, what's the right kind of computation?  What I've been trying to say
is that what we call consciousness is a big collection of functions,
but the most striking (in my opinion) are the functions that give rise
to being able to think about (that is, make calculations and plans
based on) descriptions of recent events.  You can quarrel about the
precise definitions, but it is the functional aspects of this (not the
"subjective" ones) that concern me here.  To do such things, you need
neural nets that have some nice memory properties and some nice
pattern-recognition properties.  Otherwise, it's just any old FSA.  
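
A toy sketch of that functional picture -- every name and rule here is a hypothetical illustration, not a claim about how brains do it: a system that keeps short descriptions of recent events in a bounded memory and applies crude pattern recognition over them to pick its next action. That combination, however simple, is already more structure than an arbitrary FSA need have.

```python
# Hypothetical illustration only: an agent with (1) a nice memory
# property -- bounded storage of descriptions of recent events -- and
# (2) a crude pattern-recognition rule over that memory.

from collections import deque

class RecentEventAgent:
    def __init__(self, memory_span=5):
        # Short-term memory: descriptions of the last few events.
        self.recent = deque(maxlen=memory_span)

    def observe(self, description):
        self.recent.append(description)

    def plan(self):
        # Crude pattern recognition: react to a repeated recent event.
        if list(self.recent)[-2:] == ["hot", "hot"]:
            return "move away"
        return "keep exploring"

agent = RecentEventAgent()
for event in ["cold", "warm", "hot", "hot"]:
    agent.observe(event)
assert agent.plan() == "move away"
```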

The point I was trying to make was to resist the idea of consciousness
as a mysterious indescribable attribute and, instead, to try to
describe various (objective) activities that might serve to span the
range of phenomena -- call them subjective if you like -- that we
associate with that term.  This ought to approach a certain
comprehensiveness, simply because if there remains anything more one
can _say_ about it, then that aspect has a remaining objective
component.  And this isn't quite a "behavioristic" position because it
would include brain-probes and other physical interventions.  

Anyway, there are lots of "partial" mechanisms in other kinds of
matter besides brain, mechanisms that could give rise to some facets
of mind-like behavior, but you'd need a lot of them to get anything
much like "thinking".  I have no scorn for rocks, and perhaps the
state changes during the propagation of lattice imperfections under
stress are quite respectably complex.  There are FSAs and FSAs.  But
there are lots of reasons to be chauvinistic about some as opposed to
others...

-- marvin minsky


