Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!newsfeed.internetmci.com!howland.reston.ans.net!psinntp!psinntp!psinntp!psinntp!megatest!news
From: Dave Jones <djones>
Subject: Re: General Regression Neural Net (GRNN)
Content-Type: text/plain; charset=iso-8859-1
Message-ID: <DMor70.1Jw@Megatest.COM>
Sender: news@Megatest.COM (News Admin)
Nntp-Posting-Host: pluto
Content-Transfer-Encoding: 8bit
Organization: Megatest Corporation
References: <4f78hl$m6a@goya.eunet.es> <5rDJJG+.predictor@delphi.com>
Mime-Version: 1.0
Date: Mon, 12 Feb 1996 23:13:47 GMT
X-Mailer: Mozilla 1.1N (X11; I; SunOS 5.4 sun4m)
X-Url: news:5rDJJG+.predictor@delphi.com
Lines: 38

Will Dwinnell <predictor@delphi.com> wrote:
>Jose Parga <bolsamad@dial.eunet.es> writes:
> 
>>I am reading E. Gately's book and in some chapters
>>he refers to General Regression Neural Networks (GRNN).
>>
>>Does anybody know what it is?

It's a kernel estimator used to approximate arbitrary functions.
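If it helps to see it concretely, the prediction step is just a kernel-weighted average of the training targets (Nadaraya-Watson regression). Here's a rough Python sketch with a Gaussian kernel -- the function name and the sigma value are my own choices, not anything canonical:

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """Kernel-weighted average of training targets (the core of a GRNN).

    Each training point votes with a Gaussian weight based on its
    distance to the query point x; the prediction is the weighted
    average of the corresponding targets.
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances to x
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
    return np.dot(w, y_train) / np.sum(w)

# toy example: approximate y = x^2 from 50 samples on [0, 1]
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = (X ** 2).ravel()
print(grnn_predict(X, y, np.array([0.5]), sigma=0.05))
```

Note there is no training loop at all -- every training point is kept, and the only free parameter is the kernel width (sigma here), which is exactly the "window size" issue discussed below.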

> ... and where can I find
>>information on the algorithm?
> 
>Have you checked the FAQ?  You might also try Wasserman's "Advanced
>Methods in Neural Computing".

I very definitely DO NOT recommend that book. Timothy Masters'
_Advanced Neural Network Recipes_ (or something like that) is pretty good,
so far as it goes. The tricky part with kernel systems is generally
the calculation of the window size. To really understand what's going on
there, check out B.W. Silverman's _Density Estimation for Statistics and Data
Analysis_. There is nothing in it about the GRNN per se, but it covers kernel
density estimators very lucidly. The GRNN is a straightforward application of
that method, as is the "PNN", also known as the Parzen-Bayes classifier. In
Silverman you will learn about choosing window sizes and about obscure but
useful things like the "Epanechnikov kernel", which I have found invaluable.
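To give a flavor of both points, here's a rough Python sketch of a rule-of-thumb window width (Silverman's well-known 0.9 * min(std, IQR/1.34) * n^(-1/5) formula) and a density estimate with the Epanechnikov kernel. The function names are my own shorthand; see the book for the actual derivations:

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule-of-thumb window width for 1-D data."""
    n = len(x)
    iqr = np.subtract(*np.percentile(x, [75, 25]))  # interquartile range
    return 0.9 * min(np.std(x), iqr / 1.34) * n ** (-0.2)

def epanechnikov_kde(x_train, x, h):
    """Kernel density estimate at point x with window width h.

    The Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on |u| <= 1
    minimizes the mean integrated squared error among kernels,
    which is why it keeps turning up in this literature.
    """
    u = (x - x_train) / h
    k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return k.sum() / (len(x_train) * h)

# estimate the standard normal density at 0 from a sample
rng = np.random.default_rng(0)
sample = rng.normal(size=2000)
h = silverman_bandwidth(sample)
print(epanechnikov_kde(sample, 0.0, h))   # true value is about 0.399
```

The same window-width machinery carries straight over to the GRNN and PNN cases -- that's the part the neural-net books tend to gloss over.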

           Dave

>Will Dwinnell
>Commercial Intelligence Inc.

