Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!godot.cc.duq.edu!newsgate.duke.edu!news.mathworks.com!newsfeed.internetmci.com!in1.uu.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Code for Conjugate Gradients Search
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DsAB2p.3xE@unx.sas.com>
Date: Fri, 31 May 1996 19:20:49 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <1996May31.125853.1@otago.ac.nz>
Organization: SAS Institute Inc.
Lines: 37


See "What are conjugate gradients, Levenberg-Marquardt, etc.?" in
ftp://ftp.sas.com/pub/neural/FAQ2.html

In article <1996May31.125853.1@otago.ac.nz>, xiaodong@otago.ac.nz writes:
|> Hi, I am looking for code that does Conjugate Gradients Search. Has anybody
|> ever tried this search algorithms for neural net training? 

Yes, it works very well when you have lots of weights.

|> It is claimed by
|> some people as the best line-minimization algorithm, 

No, no, no, no. There is no such thing as "the best minimization
algorithm". (Neural nets require multidimensional minimization, not
line minimization.) Different algorithms work better under different
circumstances. See the FAQ.
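To see what "conjugate gradients with line searches" amounts to, here is a
toy sketch in Python. This is not code from the FAQ or from any production
package; the objective function, the names, and the Armijo backtracking
line search are all my own choices for illustration. It uses the
Polak-Ribiere update with the beta clipped at zero (a common restart rule):

```python
def f(x):
    # Toy objective: a strictly convex quadratic with minimum at (3, -1).
    return (x[0] - 3.0)**2 + 10.0*(x[1] + 1.0)**2

def grad(x):
    return [2.0*(x[0] - 3.0), 20.0*(x[1] + 1.0)]

def line_search(x, d, g, shrink=0.5, c=1e-4):
    # Backtracking line search: shrink the step until the Armijo
    # sufficient-decrease condition holds along direction d.
    fx = f(x)
    slope = sum(gi*di for gi, di in zip(g, d))
    alpha = 1.0
    while True:
        xn = [xi + alpha*di for xi, di in zip(x, d)]
        if f(xn) <= fx + c*alpha*slope:
            return xn
        alpha *= shrink

def cg_minimize(x, iters=200, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]                 # first direction: steepest descent
    for _ in range(iters):
        if sum(gi*gi for gi in g) < tol:
            break
        if sum(gi*di for gi, di in zip(g, d)) >= 0:
            d = [-gi for gi in g]         # restart if d is not a descent direction
        x = line_search(x, d, g)
        g_new = grad(x)
        # Polak-Ribiere beta, clipped at zero ("PR+" automatic restart)
        num = sum(gn*(gn - go) for gn, go in zip(g_new, g))
        den = sum(go*go for go in g)
        beta = max(0.0, num/den)
        d = [-gn + beta*di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x = cg_minimize([0.0, 0.0])
print(x)  # close to [3.0, -1.0]
```

In a real net the "x" would be the whole weight vector and grad() would come
from backpropagating errors; serious implementations also use better line
searches (Wolfe conditions) than this bare Armijo backtracking.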

|> even more robust and
|> faster than a Back-propagation search (see Timothy Masters's Practical Neural
|> Network Recipes in C++). 

Almost anything is faster than standard backprop.

|> Timothy has got some source code for this algorithm,
|> but I want to see any other available options, maybe a simplified one if there
|> is any. If you know this algorithm, how do you evaluate it? I mean how well you
|> think it searches for the minimum?

See references in the FAQ.


-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
