Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!news.duke.edu!news-feed-1.peachnet.edu!concert!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Weight decay
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <Cw8LHp.Ho@unx.sas.com>
Date: Fri, 16 Sep 1994 19:15:25 GMT
References: <34f6eo$ao@oreig.uji.es> <CvsLyt.6EB@unx.sas.com> <35476h$grd$1@heifetz.msen.com>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 33


In article <35476h$grd$1@heifetz.msen.com>, csi@garnet.msen.com (AI Group) writes:
|> Warren Sarle (saswss@hotellng.unx.sas.com) wrote:
|> ...
|> : If you want to use a fast learning rate but still keep the weights
|> : from becoming excessively large, try weight decay.
|> I am not familiar with weight decay, could you explain that a little more.

Weight decay adds a penalty term to the training criterion that
penalizes large weights. The penalty is usually the sum of squared
weights times a tuning constant, in which case weight decay is a
nonlinear version of ridge regression, which is known to improve
generalization in the linear case. Various other penalty terms
have also been proposed. See:

   Fahlman, S.E. (1989), "Faster-Learning Variations on
   Back-Propagation: An Empirical Study", in Touretzky, D., Hinton, G., and
   Sejnowski, T., eds., _Proceedings of the 1988 Connectionist Models
   Summer School_, Morgan Kaufmann, 38-51.

   Hertz, J., Krogh, A. & Palmer, R.G. (1991), _Introduction to the
   Theory of Neural Computation_, Addison-Wesley.

   Reed, R. (1993), "Pruning Algorithms--A Survey", IEEE Transactions
   on Neural Networks, 4, 740-747.

(I don't consider weight decay a form of pruning, but some people do.)
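To make the idea concrete, here is a minimal sketch (not from the references
above) of sum-of-squares weight decay for a single linear unit trained by
gradient descent. The learning rate and decay constant are illustrative
choices, not recommendations. The penalty lambda * sum(w_i^2) adds
2*lambda*w_i to each weight's gradient, so every update shrinks the weights
toward zero in addition to the usual error-driven step:

```python
def train_step(weights, inputs, target, lr=0.1, decay=0.05):
    """One gradient step minimizing (out - target)^2 + decay * sum(w^2)."""
    # Forward pass: output of a single linear unit.
    out = sum(w * x for w, x in zip(weights, inputs))
    err = out - target
    # The error gradient is 2*err*x; the decay penalty contributes 2*decay*w,
    # which pulls each weight toward zero on every step.
    return [w - lr * (2 * err * x + 2 * decay * w)
            for w, x in zip(weights, inputs)]

# Deliberately large starting weights; decay keeps them from staying large.
weights = [5.0, -4.0]
for _ in range(200):
    weights = train_step(weights, inputs=[1.0, 0.5], target=1.0)
# weights now sit near the ridge-regression solution: a slightly biased
# fit of the target, but with a much smaller sum of squared weights than
# the same run with decay=0.0 would produce.
```

With decay=0.0 the component of the weight vector that does not affect the
output never shrinks, so the net can fit the target while keeping arbitrarily
large weights; the penalty term removes exactly that freedom, which is the
linear-case intuition behind the ridge-regression analogy.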

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
