Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.cis.ohio-state.edu!math.ohio-state.edu!howland.reston.ans.net!newsfeed.internetmci.com!in2.uu.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: RBF Intro?
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DsHvu6.3Ku@unx.sas.com>
Date: Tue, 4 Jun 1996 21:32:30 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <4p1g18$sr9@news.nyu.edu>
Organization: SAS Institute Inc.
Lines: 25


In article <4p1g18$sr9@news.nyu.edu>, cera7013@sparky.cs.nyu.edu (Ethan Cerami) writes:
|> Can anyone tell me where I can find a good, relatively simple introduction to
|> Radial Basis Function Networks?  I have looked through the FAQ on MLP v. RBFs,
|> and have also read through Chapter 7 of Simon Haykin, "Neural Networks:  A
|> Comprehensive Foundation."  But, I am looking for simpler, more comprehensible 
|> explanations.

There are few good references on RBF networks. Bishop (1995) gives
one of the better surveys, but see also Tao (1993) on the
importance of normalizing the basis-function activations.

   Bishop, C.M. (1995), <cite>Neural Networks for Pattern Recognition,</cite>
   Oxford: Oxford University Press, especially section 6.4. 

   Tao, K.M. (1993), "A closer look at the radial basis function (RBF)
   networks," <cite>Conference Record of The Twenty-Seventh Asilomar 
   Conference on Signals, Systems and Computers</cite> (Singh, A., ed.),
   vol 1, 401-405, Los Alamitos, CA: IEEE Comput. Soc. Press.
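To make the normalization point concrete, here is a minimal sketch (my own
illustration, not code from either reference) of a 1-D RBF network with
Gaussian basis functions, computed both plain and normalized. The centers,
widths, and weights are arbitrary toy values:

```python
import numpy as np

def rbf_outputs(x, centers, width, weights, normalize=False):
    """Output of a toy RBF network at the points x.

    phi_j(x) = exp(-(x - c_j)^2 / (2 * width^2))
    Unnormalized: f(x) = sum_j w_j * phi_j(x)
    Normalized:   f(x) = sum_j w_j * phi_j(x) / sum_k phi_k(x)
    """
    x = np.asarray(x, dtype=float)[:, None]               # shape (n, 1)
    phi = np.exp(-(x - centers) ** 2 / (2 * width ** 2))  # shape (n, m)
    if normalize:
        # Normalized activations form a partition of unity,
        # so they sum to 1 at every input point.
        phi = phi / phi.sum(axis=1, keepdims=True)
    return phi @ weights

centers = np.array([-1.0, 0.0, 1.0])   # arbitrary example centers
weights = np.array([1.0, 2.0, 3.0])    # arbitrary example weights
x = np.array([-3.0, 0.0, 3.0])

print(rbf_outputs(x, centers, 0.5, weights))
print(rbf_outputs(x, centers, 0.5, weights, normalize=True))
```

The difference shows up away from the centers: the unnormalized output
decays toward zero there, while the normalized output levels off at the
weight of the nearest center, which is often the more sensible
extrapolation behavior.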

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
