Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!ceylon!wizard.pn.com!Germany.EU.net!howland.reston.ans.net!gatech!concert!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: # of hidden nodes for Radial Basis Function Networks
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <CzHAuA.Jt1@unx.sas.com>
Date: Fri, 18 Nov 1994 19:44:34 GMT
References: <CyAH5D.F4u@unx.sas.com> <785081021snz@ecowar.demon.co.uk> <3aihbt$db6@gate.fzi.de>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 23


In article <785081021snz@ecowar.demon.co.uk>, jimmy@ecowar.demon.co.uk (Jimmy Shadbolt) writes:
|>         IMHO most "constructive" algorithms are unable to (i) prune
|>         underutilised RBFs (ii) add/delete RBFs "on the fly". Can
|>         somebody shed some light on this topic?

Since many types of RBF networks are linear models (linear in the
output weights once the centers and widths are fixed), "construction"
and "pruning" reduce to subset regression, for which there are many
algorithms in the statistical literature, such as stepwise regression.
See any textbook on regression, such as:

   Miller, A.J. (1990), _Subset Selection in Regression_, London:
   Chapman & Hall.

   Myers, R.H. (1986), _Classical and Modern Regression with
   Applications_, Boston: Duxbury Press.

   Weisberg, S. (1985), _Applied Linear Regression_, New York: Wiley.
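
To make the point concrete, here is a rough sketch (mine, not from any
of the books above) of forward stepwise selection of RBF centers: fix a
pool of candidate centers, and at each step add the one whose Gaussian
basis column most reduces the residual sum of squares of the linear
least-squares fit. All function names and the toy data are my own
illustration; deletion steps and smarter orthogonalized variants are
covered in the references.

```python
# Sketch: forward stepwise selection of Gaussian RBF centers, treating
# the network as a model that is linear in its output weights.
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def forward_select(X, y, candidates, width, max_terms):
    """Greedy subset regression: at each step add the candidate center
    that most reduces the residual sum of squares (RSS)."""
    chosen = []
    for _ in range(max_terms):
        best_rss, best_j = np.inf, None
        for j in range(len(candidates)):
            if j in chosen:
                continue
            H = rbf_design(X, candidates[chosen + [j]], width)
            w, *_ = np.linalg.lstsq(H, y, rcond=None)
            rss = float(((y - H @ w) ** 2).sum())
            if rss < best_rss:
                best_rss, best_j = rss, j
        chosen.append(best_j)
    return chosen

# Toy problem: fit sin(3x) with a handful of RBFs chosen from 30
# candidate centers drawn from the training inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)
cands = X[rng.choice(200, size=30, replace=False)]
sel = forward_select(X, y, cands, width=0.3, max_terms=5)
```

A backward (pruning) pass is the mirror image: drop the chosen center
whose removal increases the RSS least, which is exactly what stepwise
regression packages automate.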

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
