Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!godot.cc.duq.edu!newsgate.duke.edu!news.mathworks.com!newsfeed.internetmci.com!in1.uu.net!orincon.com!news
From: Dopey Dope <jessem>
Subject: Re: Code for Conjugate Gradients Search
X-Nntp-Posting-Host: orcas.orincon.com
Content-Type: text/plain; charset=us-ascii
Message-ID: <DsD3y7.Jy2.0.-s@orincon.com>
To: xiaodong@otago.ac.nz
Sender: news@orincon.com (News System)
Content-Transfer-Encoding: 7bit
Organization: Orincon Corporation
References: <1996May31.125853.1@otago.ac.nz>
Mime-Version: 1.0
Date: Sun, 2 Jun 1996 07:39:42 GMT
X-Mailer: Mozilla 1.1 (X11; U; SunOS 4.1.3 sun4c)
X-Url: news:1996May31.125853.1@otago.ac.nz
Lines: 24

xiaodong@otago.ac.nz wrote:
>Hi, I am looking for code that does Conjugate Gradients Search. Has anybody
>ever tried this search algorithm for neural net training? Some people claim
>it is the best line-minimization algorithm, even more robust and faster than
>a Back-propagation search (see Timothy Masters's Practical Neural Network
>Recipes in C++). Timothy has some source code for this algorithm, but I want
>to see any other available options, maybe a simplified one if there is any.
>If you know this algorithm, how do you evaluate it? I mean, how well do you
>think it searches for the minimum?
>
>Thanks in advance.
>
>-- Xiaodong
>
>
There is a working version in Numerical Recipes in C.  It does run faster
than the usual steepest descent.  I am not certain it is "more robust and
faster than back-propagation", though, because that comparison mixes two
different things.  Backpropagation computes the gradient of the output
error; the actual minimization along that gradient can then be performed
by steepest descent, conjugate gradient, or quickprop.  Whichever you
choose, the underlying backpropagation pass is roughly the same.
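To give a feel for the method, here is a minimal sketch of conjugate
gradient (with Fletcher-Reeves updates and exact line minimization) on a
simple quadratic f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b.  This is
my own toy illustration in Python, not the Numerical Recipes code; for a
net you would replace the analytic gradient with the one backprop gives you.

```python
# Toy conjugate gradient: minimize f(x) = 0.5 x'Ax - b'x, i.e. solve Ax = b.
# Pure Python, no external libraries.  Function names are my own.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def conjugate_gradient(A, b, tol=1e-12, max_iter=None):
    n = len(b)
    max_iter = max_iter or n          # exact in n steps for an SPD matrix
    x = [0.0] * n
    r = list(b)                       # residual = b - Ax = negative gradient
    d = list(r)                       # first search direction: steepest descent
    for _ in range(max_iter):
        rr = dot(r, r)
        if rr < tol:
            break
        Ad = matvec(A, d)
        alpha = rr / dot(d, Ad)       # exact line minimization along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        beta = dot(r, r) / rr         # Fletcher-Reeves coefficient
        d = [ri + beta * di for ri, di in zip(r, d)]
    return x

if __name__ == "__main__":
    A = [[4.0, 1.0], [1.0, 3.0]]
    b = [1.0, 2.0]
    print(conjugate_gradient(A, b))   # close to [1/11, 7/11]
```

The key difference from steepest descent is the beta term: each new
direction is made conjugate to the previous ones, so the search does not
keep undoing progress along earlier directions.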

Jesse

