Newsgroups: comp.ai.neural-nets
From: mlloyd@oxfordll.demon.co.uk (Matthew Lloyd)
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!news.sprintlink.net!demon!oxfordll.demon.co.uk!mlloyd
Subject: Re: HELP ON BACKPROP NN
Distribution: world
References: <3600e3$1kgi@campus.mty.itesm.mx>
Organization: Myorganisation
Reply-To: mlloyd@oxfordll.demon.co.uk
X-Newsreader: Demon Internet Simple News v1.29
Lines: 30
Date: Sat, 24 Sep 1994 20:02:19 +0000
Message-ID: <780436939snz@oxfordll.demon.co.uk>
Sender: usenet@demon.co.uk

In article <3600e3$1kgi@campus.mty.itesm.mx>
           abbrito@mor.itesm.mx "Alexandro Brito B." writes:

> 
> Does anybody Know why  Backpropropagation Neural Nets
> does not learn when all the weights are initialized with the same value ?
> 

Yes, I do. If you run a pattern through a BP network in which all the
weights are identical, the outputs of all the units within each layer
will also be identical. Now, when you backpropagate the error through
the network, the change to each weight depends on:

   a) the original weights (through which the error is propagated back)
   b) the output of the unit at the input end of the weight
   c) the error of the unit at the output end of the weight

Consider 2 weights connecting different hidden units to the same output
unit. a) will be identical; so will b) (see above); so too will c)
(it is the same output unit, so the same error) - thus both weights
receive exactly the same change. The same argument applies throughout the
network: corresponding weights in each layer are changed by the same
amount at every timestep, so the initial symmetry is never broken. Every
hidden unit in a layer then computes the same function, so the network is
no more powerful than one with a single unit per layer - it is only
differences between weights that produce a useful network (in MOST
cases). The network does not learn.
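You can check the symmetry argument numerically. Below is a minimal
sketch (my own toy example - the 2-2-1 architecture, learning rate and
initial value of 0.5 are all arbitrary choices): a sigmoid network
trained by plain gradient descent on one pattern, with every weight
initialised to the same value. However many steps you run, the two
hidden units remain exact copies of each other.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# All weights start at the same value -- the problem case.
w_hidden = [[0.5, 0.5], [0.5, 0.5]]   # w_hidden[j][i]: input i -> hidden j
w_out = [0.5, 0.5]                    # hidden j -> output
x = [1.0, -1.0]                       # one training pattern (arbitrary)
target = 1.0
lr = 0.1

for step in range(100):
    # Forward pass: both hidden units see the same weights, so h[0] == h[1].
    h = [sigmoid(sum(w_hidden[j][i] * x[i] for i in range(2)))
         for j in range(2)]
    y = sigmoid(sum(w_out[j] * h[j] for j in range(2)))

    # Backward pass (squared error, sigmoid derivative y*(1-y)).
    delta_out = (y - target) * y * (1.0 - y)
    delta_h = [delta_out * w_out[j] * h[j] * (1.0 - h[j]) for j in range(2)]

    # Identical inputs to identical deltas give identical updates.
    for j in range(2):
        w_out[j] -= lr * delta_out * h[j]
        for i in range(2):
            w_hidden[j][i] -= lr * delta_h[j] * x[i]

# After any number of steps the symmetry is unbroken:
print(w_hidden[0] == w_hidden[1], w_out[0] == w_out[1])  # True True
```

This is why implementations break the symmetry by initialising the
weights to small *random* values instead.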


Matthew Lloyd, mlloyd@oxfordll.demon.co.uk
(Oxford, England, aged 14)

