Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!uunet!in1.uu.net!olivea!charnel.ecst.csuchico.edu!csusac!csus.edu!news.ucdavis.edu!sunnyboy.water.ca.gov!sunnyboy!nsandhu
From: nsandhu@venice.water.ca.gov (Nicky Sandhu)
Subject: Re: Conjugate Gradient method
In-Reply-To: ma281209@hkpu01.hkp.hk's message of Fri, 10 Mar 1995 17:29:44 GMT
Message-ID: <NSANDHU.95Mar13101521@grizzly.water.ca.gov>
Sender: news@sunnyboy.water.ca.gov
Organization: Calif. Dept. of Water Resources
References: <3jhkod$3sn@eng_ser1.erg.cuhk.hk>
	<TAP.95Mar7103805@pearson.epi.terryfox.ubc.ca>
	<D58J9L.LCw@hkpu01.hkp.hk>
Date: Mon, 13 Mar 1995 18:15:21 GMT
Lines: 9

>>>>> On Fri, 10 Mar 1995 17:29:44 GMT, ma281209@hkpu01.hkp.hk (ma) said:

ma> I am now using a recurrent network to model currency values. As I found
ma> that my program is rather slow, do you think that the conjugate gradient
ma> method can help me (I am using steepest descent now)?  Any other suggestions?

	Well, you can always try quickprop. I use SNNSv3.3 and found it
to be way faster than vanilla gradient backprop for recurrent networks.
-Nicky
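
For the curious: quickprop (Fahlman, 1988) fits a per-weight parabola to the
last two gradients and jumps toward its minimum, which is why it often beats
plain steepest descent. A minimal NumPy sketch of the update rule follows --
the function name, learning rate, and growth cap `mu` are my own illustrative
choices, not SNNS's actual implementation:

```python
import numpy as np

def quickprop_step(w, grad, prev_grad, prev_step, lr=0.1, mu=1.75):
    # Quickprop: treat the error surface for each weight as a parabola
    # through the last two gradient measurements and jump to its vertex:
    #   step = grad / (prev_grad - grad) * prev_step
    denom = prev_grad - grad
    moving = prev_step != 0
    safe = moving & (np.abs(denom) > 1e-12)
    factor = np.where(safe, grad / np.where(safe, denom, 1.0), 0.0)
    # Cap the growth factor at +/- mu so a flat parabola can't cause
    # an enormous jump (mu = 1.75 is the value Fahlman suggests).
    factor = np.clip(factor, -mu, mu)
    step = factor * prev_step
    # Weights that were not moving fall back to a plain gradient step.
    step = np.where(moving, step, -lr * grad)
    return w + step, step

# Usage: minimize E = 0.5 * w^2 (so the gradient is just w).
w = np.array([2.0])
prev_grad = np.zeros_like(w)
prev_step = np.zeros_like(w)
for _ in range(20):
    grad = w.copy()
    w, step = quickprop_step(w, grad, prev_grad, prev_step)
    prev_grad, prev_step = grad, step
```

On a quadratic like this the secant jump is exact once it is uncapped, so the
loop converges in a handful of iterations; on real recurrent-net error
surfaces the parabola is only an approximation, hence the cap and fallback.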
