Article 1798 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!yale!hsdndev!news.bbn.com!papaya.bbn.com!cbarber
From: cbarber@bbn.com (Chris Barber)
Newsgroups: comp.ai.philosophy
Subject: Re: Dennett on Edelman--what a total loss
Message-ID: <3942@papaya.bbn.com>
Date: 2 Dec 91 15:46:15 GMT
References: <57569@netnews.upenn.edu> <1991Nov27.031545.11235@bronze.ucs.indiana.edu> <57730@netnews.upenn.edu> <1991Nov29.050859.21552@bronze.ucs.indiana.edu>
Organization: BBN Systems and Technology, Inc.
Lines: 21

In article <1991Nov29.050859.21552@bronze.ucs.indiana.edu> 
        chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:

>In other words, he has identified connectionism entirely with
>the use of Hopfield nets and Boltzmann machines, which in fact
>form a small and non-central subset of the field.  He appears not
>to have heard of backpropagation, for instance, which has been
>much more central to connectionist practice, and which fits none
>of the descriptions above.

Maybe this is because backpropagation cannot be implemented with
real neurons!  In fact, most neural network paradigms have nothing
to do with the way real neurons and brains work.  Most of the people
I have met who work on neural nets have never taken a neurophysiology
course and don't intend to.  This is not to say that everyone is
wasting their time.  Backpropagation as a computational learning paradigm
is fairly successful; it's just not what the brain does.
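
For anyone who hasn't seen the algorithm, here is a minimal sketch of
backpropagation on a toy two-layer network (in Python/NumPy, purely
illustrative; the network sizes and learning rate are arbitrary choices).
The point to notice is that the backward pass sends the error signal back
through the very same weight matrix used in the forward pass -- the
"weight transport" step that is usually cited as having no obvious
analogue in real neurons.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: learn XOR with 2 inputs, 3 hidden units, 1 output.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(scale=0.5, size=(2, 3))
    W2 = rng.normal(scale=0.5, size=(3, 1))
    lr = 0.5

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10000):
        # Forward pass
        h = sigmoid(X @ W1)        # hidden activations
        out = sigmoid(h @ W2)      # network output

        # Backward pass (chain rule)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)   # error carried back through W2 itself

        # Gradient-descent updates
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_h

    print(out.round(2))   # should move toward [[0], [1], [1], [0]]

A real neuron has no known mechanism for running its synapses "in
reverse" like this, which is why the algorithm is generally regarded as a
computational tool rather than a model of what brains do.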

-- 
Christopher Barber
(cbarber@bbn.com)


