Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!cis.ohio-state.edu!math.ohio-state.edu!howland.reston.ans.net!newsjunkie.ans.net!butch!enterprise!news
From: Don Specht
Subject: Re: Origins of GRNN
Message-ID: <1995Aug4.184405.27987@enterprise.rdd.lmsc.lockheed.com>
Followup-To: Re: Origins of GRNN
Keywords: general regression neural networks, GRNN, PNN, Parzen windows
Organization: Lockheed
X-Newsreader: <WinQVT/Net v3.9>
Date: Fri, 4 Aug 95 18:44:05 GMT
Lines: 20

Thanks, Dr. Sarle, for the references.  I have sent away to get copies.
You didn't comment on the other 3/4 of my 1968 report, namely, the Volterra 
polynomial representation.  I did formalize it later and published it in the 
statistical literature:
"Series Estimation of a Probability Density Function," D. F. Specht, 
Technometrics, 13, 409-424 (May 1971).
This is a general paper, since nonparametric estimation of pdfs is basic to 
both regression and classification.  The technique of this paper has both good 
and bad points.  It provides for one-pass learning with a fixed number of 
coefficients that does not grow with the size of the training database. 
However, the number of cross products grows factorially with the dimensionality 
of the problem.  It is therefore useful for large databases with small 
dimensionality.  With pruning of high-order cross products, it is also useful 
for larger dimensionality.
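[For illustration only -- this sketch is not from the paper. If the series expansion retains all cross-product terms up to some total degree r in d dimensions, the coefficient count is the binomial C(d + r, r): fixed with respect to the number of training samples, as described above, but climbing steeply with d. The degree cap r and the function name are assumptions for the example.]

```python
from math import comb

def num_poly_terms(d, r):
    """Number of coefficients in a full polynomial of total degree r
    in d variables, including all cross-product terms: C(d + r, r).
    (Illustrative count; the 1971 paper's exact truncation may differ.)"""
    return comb(d + r, r)

# The count is fixed once d and r are chosen -- one-pass learning can
# accumulate these coefficients without storing the training set -- but
# it grows rapidly with dimensionality d (here r = 4):
for d in (2, 5, 10, 20):
    print(d, num_poly_terms(d, 4))
# 2 -> 15, 5 -> 126, 10 -> 1001, 20 -> 10626
```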
Dr. Sarle, since you evidently are interested in the history of statistical 
techniques, I would like to ask you a favor.  Can you tell me if this pdf 
estimation technique has been independently invented either before or after my 
1968 report or 1971 paper?  If so, I will reference them in the future.  I am 
sending you a copy of the paper by regular mail.

