From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!usc!wupost!micro-heart-of-gold.mit.edu!news.media.mit.edu!minsky Tue Feb 11 15:25:49 EST 1992
Article 3591 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!usc!wupost!micro-heart-of-gold.mit.edu!news.media.mit.edu!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Subject: Re: Strong AI and panpsychism
Message-ID: <1992Feb8.033821.16351@news.media.mit.edu>
Sender: news@news.media.mit.edu (USENET News System)
Cc: minsky
Organization: MIT Media Laboratory
References: <1992Feb3.113723.2519@arizona.edu> <1992Feb4.151115.5600@news.media.mit.edu> <1992Feb6.113740.2533@arizona.edu>
Distribution: world,local
Date: Sat, 8 Feb 1992 03:38:21 GMT
Lines: 55

In article <1992Feb6.113740.2533@arizona.edu> bill@NSMA.AriZonA.EdU (Bill Skaggs) writes:
>minsky@media.mit.edu (Marvin Minsky) writes:
>>What's more, I don't even see why those formal systems even need to be
>>run on real computers, if they are specified complete with their
>>environments.  Those virtual beings, just as "conscious" as me and
>>(presumably) you, can lead arbitrarily rich, imaginative lives, or
>>whatever.
>
>Bill Skaggs
>>  Well, here's the problem, as I see it:  
>>
>>  Consider an arbitrary rock, and an arbitrary finite state
>>automaton.  There exists a mapping from vibrational states
>>of the rock to states of the FSA which preserves the state
>>transition function of the FSA.  (The mapping is probably
>>time-dependent, but so what?)  Under this mapping, the rock
>>is performing the same computation as the FSA.
>>
>>  Therefore, if an FSA can be conscious, and consciousness is
>>merely a matter of performing the right sort of computation,
>>then a rock can be conscious.
>>
>>  What's wrong with this reasoning?
>
>MM:
>>Nothing.  So long as it performs the _right sort_ of computation.  But
>>there's no reason to think that rocks can do this. Brains, in good
>>health, can -- but this is a result of 3 billion years of evolution.
>>Rocks don't evolve because of not having hereditary structural codes,
>>etc.
>
[...]
>But what sorts of mappings are allowed?  If any arbitrary, time-
>dependent mapping is acceptable, then the dynamics of *any*
>object can be mapped to *any* Turing machine -- so every
>object is simultaneously performing every possible computation.
>
>Therefore, if we want to avoid rabid panpsychism, we must restrict
>the set of allowable mappings -- but I would claim that restricting
>the set of mappings amounts to grounding the system.
>
>	-- Bill
>
>P.S. I understand that Putnam has made essentially the same
>argument somewhere.

Can you explain what he [Putnam] means by a time-dependent mapping?
If it means what I fear, there's no reason to call it a mapping.  On
the other hand, non-time-dependent state automata naturally fall into
equivalence classes under isomorphism, which you could regard as
"grounding" if you want.  I presume that Putnam's time-dependent
mappings are determined by another automaton?  He must have something
to keep all the successive states from being indistinguishable.
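[The worry about fully arbitrary time-dependent mappings can be made concrete with a minimal sketch.  Everything below is illustrative, not anyone's actual formalism: pick any FSA, record its run, and define the "mapping" at time t to send whatever the rock is doing to whatever the FSA's run says at time t.  The transition function is then "preserved" by construction, for any rock whatsoever.]

```python
# Sketch of the "a rock implements any FSA" trick, using a trivially
# time-dependent mapping.  All names are hypothetical illustrations.

# An arbitrary FSA: three states and a transition function.
def fsa_step(s):
    return {"A": "B", "B": "C", "C": "A"}[s]

# The FSA's run, starting from state "A".
fsa_run = ["A"]
for _ in range(5):
    fsa_run.append(fsa_step(fsa_run[-1]))

# A "rock": any sequence of distinct physical states over time will do.
rock_run = [("vibrational-state", t) for t in range(len(fsa_run))]

# The trivial time-dependent mapping: at time t, map the rock's state
# to whatever state the FSA's run happens to contain at time t.
def mapping(rock_state, t):
    return fsa_run[t]

# The mapping "preserves" the transition function -- by construction.
for t in range(len(fsa_run) - 1):
    assert fsa_step(mapping(rock_run[t], t)) == mapping(rock_run[t + 1], t + 1)

# These assertions pass for ANY rock_run, which is exactly the problem:
# the mapping, not the rock, is doing all the computational work.  A
# non-time-dependent mapping would have to be one fixed function of the
# rock's state alone, and then the isomorphism constraint has teeth.
```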
