Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!gatech!concert!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Marquardt-Levenberg Backpropagation
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D1FsJH.I3I@unx.sas.com>
Date: Mon, 26 Dec 1994 21:18:53 GMT
References:  <D16p3M.CKr@osuunx.ucc.okstate.edu>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 16


In article <D16p3M.CKr@osuunx.ucc.okstate.edu>, fmeng@rsa.ceat.okstate.edu (FUN MENG HOCK) writes:
|>      Somebody inquired about a reference for the Marquardt-Levenberg
|> backpropagation method (the fastest-converging backpropagation
|> method for a feedforward neural network).

No, L-M is not the fastest converging training method. There is NO SUCH
THING as a fastest training method. L-M is certainly among the faster
methods for small networks, but each L-M iteration must store and solve
a linear system in a W-by-W matrix for W weights: O(W^2) memory and
O(W^3) time per step. So if you have a net with 10,000 weights, even
standard backprop might be faster.
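To make the cost argument concrete, here is a minimal sketch (mine, not from the original post) of one Levenberg-Marquardt step for a toy two-parameter least-squares fit, y = b * exp(a * x). The damping constant mu and the toy data are assumptions for illustration; the point is the normal-equations solve, which for a real network is a W-by-W system:

```python
import math

def lm_step(params, xs, ys, mu):
    """One Levenberg-Marquardt update for fitting y = b * exp(a * x)."""
    a, b = params
    # Residuals r_i = y_i - f(x_i) and Jacobian rows J_i = df/d(a, b)
    r = [y - b * math.exp(a * x) for x, y in zip(xs, ys)]
    J = [(b * x * math.exp(a * x), math.exp(a * x)) for x in xs]
    # Normal equations: (J^T J + mu * I) delta = J^T r
    A = [[sum(Ji[p] * Ji[q] for Ji in J) + (mu if p == q else 0.0)
          for q in range(2)] for p in range(2)]
    g = [sum(Ji[p] * ri for Ji, ri in zip(J, r)) for p in range(2)]
    # Solve the 2x2 system by Cramer's rule.  For a net with W weights
    # this is a W-by-W factorization, O(W^3) per step -- the bottleneck.
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    da = (g[0] * A[1][1] - g[1] * A[0][1]) / det
    db = (A[0][0] * g[1] - A[1][0] * g[0]) / det
    return (a + da, b + db)

# Noiseless toy data generated with a = 0.5, b = 2.0
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
params = (0.3, 1.0)
for _ in range(50):
    params = lm_step(params, xs, ys, mu=1e-3)
```

With two parameters the solve is trivial and L-M converges in a handful of steps; scale W from 2 to 10,000 and the same solve dominates everything, which is why a cheap O(W) gradient step can win per unit of CPU time.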

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
