Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!newsfeed.internetmci.com!in1.uu.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Q: Small or large weights ?
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <Dtu4zv.Czu@unx.sas.com>
Date: Sun, 30 Jun 1996 22:55:07 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <4qrk8c$96f@eng_ser1.erg.cuhk.hk> <4r0tlt$nge@eng_ser1.erg.cuhk.hk>
Organization: SAS Institute Inc.
Lines: 37


In article <4r0tlt$nge@eng_ser1.erg.cuhk.hk>, ccszeto@cs.cuhk.hk (Szeto Chi Cheong) writes:
|> There are two learning strategies
|> (1) small number of large weights
|>     e.g. weight elimination, pruning
|> (2) large number of small weights
|>     e.g. weight decay
|> Which one is expected to give better generalization ?

It depends on the data. But for linear models, unless there is a strong
prior reason to suspect that only a few variables are relevant, a
large number of small weights tends to generalize better. A good recent
study:

   Frank, I.E. & Friedman, J.H. (1993) "A statistical view of some
   chemometrics regression tools," Technometrics, 35, 109-148.

As for more general neural nets, I have never seen a systematic study
approaching the scope of Frank and Friedman. For the case of MLPs with a
single input, there is:

   Sarle, W.S. (1995), "Stopped Training and Other
   Remedies for Overfitting," Proceedings of the 27th Symposium on 
   the Interface, ftp://ftp.sas.com/pub/neural/inter95.ps.Z
   (this is a large compressed postscript file, 747K)

This simulation indicates a slight advantage for a large number of small
weights. The main trouble with using a small number of large weights is
that you are likely to get spurious near-discontinuities in the output,
but of course if the target function actually _has_ discontinuities, you
can identify them more reliably with a small number of large weights.
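To make the contrast concrete, here is a rough sketch (in Python, with
made-up lambda and w0 values) of the two kinds of penalty: plain weight
decay, which grows without bound and so shrinks every weight, versus a
Weigend-style weight-elimination penalty, which saturates for large
weights and so tolerates a few large ones while pushing the rest toward
zero.

```python
def weight_decay_penalty(weights, lam=0.01):
    # Standard weight decay: lam * sum(w^2).
    # Unbounded in |w|, so it favors many small weights.
    return lam * sum(w * w for w in weights)

def weight_elimination_penalty(weights, lam=0.01, w0=1.0):
    # Weight elimination: lam * sum((w/w0)^2 / (1 + (w/w0)^2)).
    # Each term saturates near lam as |w| grows, so a few large
    # weights cost little extra -- the penalty favors driving most
    # weights to zero while keeping a small number of large ones.
    return lam * sum((w / w0) ** 2 / (1.0 + (w / w0) ** 2)
                     for w in weights)

many_small = [0.1] * 10
few_large  = [10.0, 10.0]

print(weight_decay_penalty(many_small),
      weight_decay_penalty(few_large))
print(weight_elimination_penalty(many_small),
      weight_elimination_penalty(few_large))
```

Under weight decay, the few-large configuration is penalized heavily;
under weight elimination, it costs barely more than one saturated term
per nonzero weight, which is why the latter is associated with pruning.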

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
