Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!cam-news-feed3.bbnplanet.com!news.bbnplanet.com!cam-news-hub1.bbnplanet.com!www.nntp.primenet.com!nntp.primenet.com!howland.erols.net!news.sprintlink.net!news-peer.sprintlink.net!news.sprintlink.net!news-hub.sprintlink.net!news.sprintlink.net!news-stk-11.sprintlink.net!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: BackProp convergence
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <E280K2.BDJ@unx.sas.com>
Date: Tue, 10 Dec 1996 23:14:26 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <32A6A1DE.41C6@dxcoms.cern.ch> <586kfl$j9t@dscomsa.desy.de>
Organization: SAS Institute Inc.
Lines: 27


In article <586kfl$j9t@dscomsa.desy.de>, sieversm@zedy05.desy.de (Michael Sievers) writes:
|> john nigel gamble (gamble@dxcoms.cern.ch) wrote:
...
|> : If I continue to train the network the rms error keeps
|> : improving (goes to 1 e-13 and better) and the weights go to
|> : weight limits that I have imposed (i.e. +-20) rather than
|> : stabilising on some other "balanced" value.
...
|> If this bothers you, try using a step function for output (==Heaviside
|> function). 

A step function will not work very well with backprop (let's not get
into another discussion on the derivative of a step function). If you
want to avoid infinite weights in noise-free classification problems,
you can use an activation that equals the sine function on
[-pi/2,pi/2] and is held constant at +-1 outside that interval, so
that it saturates exactly at a finite input rather than only
asymptotically. Of course, people tend to worry about saturation, but
I have never found it to be much of a problem when using conventional
optimization algorithms.
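For concreteness, here is a sketch of such an activation and its
derivative in Python/NumPy (the function names are mine, not from any
package; this is one way to write it, not the only way):

```python
import numpy as np

def sat_sine(x):
    # Equals sin(x) on [-pi/2, pi/2]; clamped to +/-1 outside,
    # so the unit saturates exactly at a finite input.
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= np.pi / 2, np.sin(x), np.sign(x))

def sat_sine_deriv(x):
    # Derivative is cos(x) inside the interval and exactly 0 outside.
    # Once a unit saturates, its incoming weights stop growing,
    # which is what prevents the weights from running off to infinity.
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= np.pi / 2, np.cos(x), 0.0)
```

Because the derivative is exactly zero past +-pi/2, the error gradient
cannot keep pushing the weights outward the way it does with a
logistic function, whose derivative only approaches zero.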

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

