From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!rutgers!rochester!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!andrew.cmu.edu!fb0m+ Wed Dec 18 16:02:40 EST 1991
Article 2236 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!uwm.edu!rutgers!rochester!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!andrew.cmu.edu!fb0m+
From: fb0m+@andrew.cmu.edu (Franklin Boyle)
Newsgroups: comp.ai.philosophy
Subject: Re: Scaled up slug brains
Message-ID: <IdHqyqK00UhW01duAN@andrew.cmu.edu>
Date: 18 Dec 91 15:56:38 GMT
Organization: Cntr for Design of Educational Computing, Carnegie Mellon, Pittsburgh, PA
Lines: 24

Stanley Friesen writes:

>In article <40705@dime.cs.umass.edu> yodaiken@chelm.cs.umass.edu (victor yodaiken) writes:
>|There is no a priori reason to
>|believe that thoughts can arise from transistors, or that collections of
>|devices which mimic some of the electrical behavior of a neuron will be
>|sufficient to produce consciousness or even complex problem solving. We
>|can flap our hands up and down all day, but we still won't fly.
> 
>But we *can* add up numbers all day and get a total.  The analogy with a
>physical process is only valid if cognition is *primarily* a physical process.
>If it is primarily an informational process, then informational processes
>can duplicate it in reality.
> 
>And I believe that the "mind" shows all of the properties of an informational
>process.  So your analogy is irrelevant.

Since informational processes are also physical, could you elaborate on the
differences implied here between "physical process" and "informational
process"?

-Frank
