Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!news.ultranet.com!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Origins of GRNN
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DCM9xw.442@unx.sas.com>
Date: Tue, 1 Aug 1995 05:29:56 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <1995Jul31.174825.26852@enterprise.rdd.lmsc.lockheed.com>
Organization: SAS Institute Inc.
Keywords: general regression neural networks, GRNN, PNN, Parzen windows
Lines: 38


In article <1995Jul31.174825.26852@enterprise.rdd.lmsc.lockheed.com>, Don Specht writes:
|> Subject: Origins of GRNN
|> Prof. Ripley of Oxford and Warren Sarle of SAS have commented in recent postings
|> that the General Regression Neural Network (GRNN) is similar to the kernel
|> regression of statistical literature.  

Not just similar, but identical to Nadaraya-Watson kernel regression.
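
To spell out the equivalence: with a Gaussian kernel, the Nadaraya-Watson
estimate yhat(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h) is the
same weighted average of training targets that a GRNN computes, with the
bandwidth h playing the role of Specht's smoothing parameter sigma. A
minimal sketch for the 1-d case (the function name and default sigma are
my own, not from either paper):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, sigma=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    With this kernel the estimate coincides with a GRNN's output;
    sigma is the smoothing (bandwidth) parameter.
    """
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    x_query = np.asarray(x_query, dtype=float)
    # Squared distance from each query point to each training point
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Kernel-weighted average of the training targets
    return (w @ y_train) / w.sum(axis=1)
```

Since the weights sum to one after normalization, the estimate is always
a convex combination of the observed y values, which is why the method
cannot extrapolate outside their range.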

|> Prof. Ripley writes, "Kernel discriminant
|> analysis has been widely used in medical and chemical applications through the
|> work of Hermans, Habbema and co-workers and their ALLOC-80 package.  They have
|> been the true enablers of this methodology, about 15 years ago."
|> He says that I should have referenced Hermans, et al in my paper, "A General
|> Regression Neural Network," which appeared in the IEEE Trans. on Neural
|> Networks, Vol. 2, Nov. 1991.  Perhaps I should have to tie things together, but
|> not to acknowledge priority.  The GRNN paper is a reformulation of an earlier
|> paper in terms of neural networks, with emphasis on the parallel structure which
|> can be used in dedicated parallel hardware.  The earlier paper derives what is
|> now known as kernel regression plus the Volterra series approximation to it,
|> and is:
|> "A practical technique for estimating general regression surfaces," by D. F.
|> Specht, June 1968, Lockheed report LMSC 6-79-68-6, Defense Technical Information
|> Center AD-672505, and also available from NASA, acquisition number N68-29513.

Priority goes to:

   Nadaraya, E.A. (1964) "On estimating regression", Theory Probab.
   Applic. 9, 141-2.

   Watson, G.S. (1964) "Smooth regression analysis", Sankhya,
   Series A, 26, 359-72.

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
