Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!fas-news.harvard.edu!newspump.wustl.edu!news.starnet.net!wupost!howland.reston.ans.net!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: help: conjugate gradient/newton's method
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D42Auu.1F3@unx.sas.com>
Date: Wed, 15 Feb 1995 22:08:54 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <randy-1402951349080001@rm3346m1.cs.byu.edu>
Organization: SAS Institute Inc.
Lines: 24


In article <randy-1402951349080001@rm3346m1.cs.byu.edu>, randy@axon.cs.byu.edu (Randy Wilson) writes:
|>   I am trying to get a grip on what I am calling 'backprop acceleration
|> techniques', though they are probably really just different techniques
|> altogether.  Namely, I am trying to understand Newton's Method, Conjugate
|> Gradient techniques, and other techniques that use the second derivative
|> of the error function to (hopefully) more quickly search the weight space
|> of a multi-layer network. ...  I was wondering if you happen to
|> have any references on some good papers that might help me understand this
|> stuff. 

   Dennis, J.E. and Schnabel, R.B. (1983) Numerical Methods for
   Unconstrained Optimization and Nonlinear Equations, Prentice-Hall

   Fletcher, R. (1987) Practical Methods of Optimization, Wiley: NY.

   Gill, P.E., Murray, W. and Wright, M.H. (1981) Practical
   Optimization, Academic Press: London.
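The books above cover the theory; as a quick illustration of the flavor
of these methods, here is a minimal sketch (not from any of the books,
and simplified for a quadratic error surface) of conjugate gradient
minimization with an exact line search and a Polak-Ribiere update. The
matrix A and vector b below are hypothetical stand-ins for the Hessian
and linear term of the error function:

```python
import numpy as np

# Hypothetical quadratic objective: f(w) = 0.5 w'Aw - b'w,
# standing in for the error surface of a network near a minimum.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def grad(w):
    # Gradient of the quadratic objective
    return A @ w - b

def conjugate_gradient(w, n_iter=50):
    g = grad(w)
    d = -g                      # first direction: steepest descent
    for _ in range(n_iter):
        # Exact line search, valid for a quadratic: minimize along d
        alpha = -(g @ d) / (d @ A @ d)
        w = w + alpha * d
        g_new = grad(w)
        if np.linalg.norm(g_new) < 1e-10:
            break
        # Polak-Ribiere beta; restart (steepest descent) if negative
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        g = g_new
    return w

w_star = conjugate_gradient(np.zeros(2))
```

For an n-dimensional quadratic this converges in at most n steps; for a
real network error function you would replace the exact line search with
an inexact one (see Fletcher or Gill et al. for the details).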

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
