From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!news.bbn.com!papaya.bbn.com!cbarber Mon Dec 16 11:00:31 EST 1991
Article 1974 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!news.bbn.com!papaya.bbn.com!cbarber
From: cbarber@bbn.com (Chris Barber)
Newsgroups: comp.ai.philosophy
Subject: Re: Neuron based neural nets
Message-ID: <3962@coconut.bbn.com>
Date: 9 Dec 91 15:55:13 GMT
References: <3942@papaya.bbn.com> <58114@netnews.upenn.edu> <3949@papaya.bbn.com> <58348@netnews.upenn.edu>
Organization: BBN Systems and Technology, Inc.
Lines: 28

In article <58348@netnews.upenn.edu> weemba@libra.wistar.upenn.edu 
        (Matthew P Wiener) writes:

>I have no idea why you say that any neuronal backpropagation would
>be chemical.

Because electric impulses travel from the body of the neuron outward to the
end of the axon, not the other way around.  Backpropagation requires that a
signal somehow travel from neurons in deeper layers back (through the axon)
to earlier neurons.  If this happens at all, it must be through some type of
chemical signal.  In fact, even regular axonal pulses are chemical in nature
rather than pure electric signals (they do not travel at the speed of
light).  An axonal pulse is really a chain reaction involving a sudden
influx of ions into the axon.  Also, neurons can only learn through
chemical mechanisms.
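For reference, the backward signal at issue is the one in the standard
backpropagation update (Rumelhart, Hinton & Williams 1986).  A minimal
sketch in Python/NumPy -- the XOR task, layer sizes, and learning rate are
arbitrary illustrative choices, not anything from the neuroscience under
discussion:

```python
import numpy as np

# Tiny two-layer net on XOR.  The point of the sketch: the forward
# pass sends activity input -> hidden -> output (the direction a real
# axonal pulse travels), while the weight update requires an error
# delta to travel output -> hidden, i.e. *backward* through the same
# connections.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass: activity flows away from the "cell body".
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: the output error is propagated back through
    # W2 to assign blame to the hidden units -- the step with no
    # known electrical counterpart in a biological axon.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The forward computation by itself would be biologically unobjectionable;
it is only the `d_out @ W2.T` step, reusing the forward weights in the
reverse direction, that demands the backward channel argued about above.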

>So they model feedback via climbing fibers, not chemistry.  Whether
>this counts as backpropagation per se, I don't know.

This isn't backpropagation at all. What they seem to be talking about here is
a kind of feedback loop.  The climbing fibers project back to the neurons that
are supposed to learn and thus can influence them without resorting to
backpropagation "magic".


-- 
Christopher Barber
(cbarber@bbn.com)


