From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!caen!garbo.ucc.umass.edu!dime!chelm.cs.umass.edu!yodaiken Mon Dec 16 11:02:10 EST 1991
Article 2147 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!caen!garbo.ucc.umass.edu!dime!chelm.cs.umass.edu!yodaiken
>From: yodaiken@chelm.cs.umass.edu (victor yodaiken)
Newsgroups: comp.ai.philosophy
Subject: Re: Scaled up slug brains
Message-ID: <40677@dime.cs.umass.edu>
Date: 16 Dec 91 01:04:42 GMT
References: <robison.692764882@chloro> <40659@dime.cs.umass.edu> <12709@pitt.UUCP>
Sender: news@dime.cs.umass.edu
Organization: University of Massachusetts, Amherst
Lines: 55

In article <12709@pitt.UUCP> geb@dsl.pitt.edu (gordon e. banks) writes:
>In article <40659@dime.cs.umass.edu> yodaiken@chelm.cs.umass.edu (victor yodaiken) writes:
>
>>No argument here. Clearly we are all built of cells, we share many
>>similarities and research on these simpler animals is a logical first
>>step towards understanding more complex ones. Note, however that Gordon
>>Banks makes a much stronger claim. Essentially Banks is arguing that 
>>the difference between slug nervous systems and human brains is just a
>>matter of scale. There is no evidence to support this claim other than the
>
>Those are your words not mine.  It would depend on what you mean by
>scale.  If scale includes complexity of the organization, I would
>agree (the I beam and the gear box are differently organized steel).
>But if you just mean size and numbers, I do not agree.
>

Let's return to the origin of this argument:

sarima@tdatirv.UUCP (Stanley Friesen) wrote in
Re: Searle and the Chinese Room, Message-ID: <302@tdatirv.UUCP>


  The serious error in Searle's reasoning is that he has *never* shown any
  *objective* evidence that my brain is doing anything that a computer attached
  to appropriate input devices could not do.
  
  And, since my knowledge of neurology suggests that all of my mental functions
  are based on electro-chemical reactions in characterizable processing elements,
  I must conclude that however our brain may achieve meaning, it is computable.
  
  I do doubt that a pure algorithm, lacking any sensory input modalities,
  could show intelligence.  But computers are just as capable of processing
  and encoding sense data as the human nervous system.
  
I objected, and then you responded as follows:
>
>In article <40332@dime.cs.umass.edu> yodaiken@chelm.cs.umass.edu (victor yodaiken) writes:
>>This is revolutionary knowledge. Could you please cite some references which
>>provide *objective* evidence that there are *characterizable processing
>>elements* in the brain and that they are the seat of all mental functions. 
>>
>
>For the processing elements, any textbook on neurophysiology.
>For the connection of the brain with mental functions, any textbook

Now you seem to be retreating from your support of Friesen's strong
AI position. If all you are claiming is that "mind" is a product of
some physical phenomena connected with the brain, then we have no argument.
On the other hand, if you claim that "mind" is a "computation", that
the fundamental principles of thought are understood, or that human minds
can be characterized in the same way as slug nervous systems, only with
more connections, then we disagree.
