Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!swrinde!sgiblab!sgigate.sgi.com!enews.sgi.com!decwrl!netcomsv!netcom.com!jbrook
From: jbrook@netcom.com (John Brookes)
Subject: Re: Rigorous Analysis on Back Propagation
Message-ID: <jbrookCwFs3q.1yu@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
X-Newsreader: TIN [version 1.2 PL1]
References: <34kd9v$b7r@lyra.csx.cam.ac.uk> <Cvttz4.7L1@uwindsor.ca>
Date: Tue, 20 Sep 1994 16:21:26 GMT
Lines: 14

B Lam (blam@hobbes.uwindsor.ca) wrote:
: I'm doing an analysis on back propagation, is there any good book that I can refer to?

: Thanks!!!

: BEN

Rumelhart and McClelland give some of the math (chapter 8, I think) for the 
classic BP algorithm, which generalizes the Widrow-Hoff delta rule. Note 
that the derivation assumes accumulating error over an entire pass through 
the training set before adjusting the weights. Incidentally, Rumelhart is 
at Stanford. Also, as in statistics, mathematicians prefer math they can 
differentiate...so the sigmoid and the weight update are often done 
differently in practice than in "skool".
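For the curious, here's a rough sketch (in Python, my choice, not from the 
book) of what "accumulate error over the whole pass, then update" looks like 
for a single sigmoid unit under squared error. Names and the toy AND data 
are made up for illustration; real BP extends this through hidden layers.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def batch_epoch(weights, data, lr=0.5):
    """One epoch of *batch* gradient descent for one sigmoid unit:
    gradients are summed over the entire training set, and the
    weights are adjusted only once at the end of the pass."""
    grad = [0.0] * len(weights)
    for inputs, target in data:
        y = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
        # delta = -dE/dnet for E = (target - y)^2 / 2
        delta = (target - y) * y * (1.0 - y)
        for i, x in enumerate(inputs):
            grad[i] += delta * x          # accumulate, don't update yet
    return [w + lr * g for w, g in zip(weights, grad)]

# toy example: learn logical AND (last input is a fixed bias of 1.0)
data = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
w = [0.0, 0.0, 0.0]
for _ in range(5000):
    w = batch_epoch(w, data)
```

The "online" variant everyone actually runs just moves the weight update 
inside the inner loop, updating after every pattern instead.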
JB
