Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!newsfeed.internetmci.com!portal.gmu.edu!hearst.acc.Virginia.EDU!news-server.ncren.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: General Regression Neural Net (GRNN)
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DMMsrL.F1x@unx.sas.com>
Date: Sun, 11 Feb 1996 21:52:33 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <4f78hl$m6a@goya.eunet.es>
Organization: SAS Institute Inc.
Lines: 34


In article <4f78hl$m6a@goya.eunet.es>, Jose Parga <bolsamad@dial.eunet.es> writes:
|> I am reading an E. Gately book and in some chapters
|> he refers to General Regression Neural Networks (GRNN).
|> 
|> Does anybody know what it is, and where I can find
|> information on the algorithm?

GRNN is a neologism for Nadaraya-Watson kernel regression, which has
been reinvented twice (that I know of) in the neural net literature:

   Specht, D.F. (1991) "A General Regression Neural Network",
   IEEE Transactions on Neural Networks, 2, Nov. 1991, 568-576.

   Schi\oler, H. and Hartmann, U. (1992) "Mapping Neural Network
   Derived from the Parzen Window Estimator", Neural Networks, 5, 903-909.
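To make the equivalence concrete, here is a rough sketch (mine, not from
any of the papers above) of the Nadaraya-Watson estimator with a Gaussian
(Parzen) kernel, which is exactly what a GRNN computes: the prediction at
a point is a kernel-weighted average of the training targets. The function
name, the `bandwidth` parameter, and the toy data are my own choices for
illustration.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    yhat(x) = sum_i K((x - x_i)/h) y_i  /  sum_i K((x - x_i)/h)

    This weighted average of the y_i is what Specht's GRNN computes,
    with the bandwidth h playing the role of his "smoothing parameter".
    """
    # Pairwise squared distances between query and training points
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    # Gaussian (Parzen) kernel weights
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    # Kernel-weighted average of the training targets at each query point
    return (w @ y_train) / w.sum(axis=1)

# Toy example: smooth a noisy sine curve
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
yhat = nadaraya_watson(x, y, x, bandwidth=0.3)
```

Note there is no training step beyond storing the data; like Parzen
density estimation, all the work happens at prediction time, and the
only free parameter is the bandwidth.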

The original references are:

   Nadaraya, E.A. (1964) "On estimating regression", Theory Probab.
   Applic. 10, 186-90.

   Watson, G.S. (1964) "Smooth regression analysis", Sankhy{\=a},
   Series A, 26, 359-72.

A good summary of current theory is in:

   Haerdle, W. (1990), _Applied Nonparametric Regression_, Cambridge
   Univ. Press.
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
