Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!bloom-beacon.mit.edu!spool.mu.edu!darwin.sura.net!news.Vanderbilt.Edu!NewsWatcher!user
From: goldenjb@ctrvax.vanderbilt.edu (jim golden)
Subject: Re: Statistics vs NN (was Re: Newbie question on stock prediction)
Message-ID: <goldenjb-141194121351@129.59.170.62>
Followup-To: comp.ai.neural-nets
Sender: news@news.vanderbilt.edu
Nntp-Posting-Host: 129.59.170.62
Organization: vanderbilt
References: <39b9in$of0@omnifest.uwm.edu> <39rhlg$pmj@maui.cs.ucla.edu> <39s6p3$jld@Radon.Stanford.EDU> <39tu9q$am1@maui.cs.ucla.edu> <39umkb$kff@Radon.Stanford.EDU>
Distribution: na
Date: Mon, 14 Nov 1994 16:55:54 GMT
Lines: 26

In article <39umkb$kff@Radon.Stanford.EDU>, drakop@Xenon.Stanford.EDU (John
Andrew Drakopoulos) wrote:

> Now, going back to the subject I include below a list of references that
> cover most of what I have posted in this newsgroup.
> I hope you would study them carefully before you make any further
> statement on the subject.
> 
> Regards,
> 
> John.
selected refs...

I'm sorry that I missed the original thread.  I only saw the most recent
three posts, but whenever I see "Kolmogorov" my antennae go up.  Could
someone enlighten me on this discussion?  Mr. Drakopoulos's reference list
is an almost exact duplicate of my dissertation references, so I am curious
about the subject.  I've spent a lot of time working on the problem of NNs
as universal function approximators, and I am always interested in more
dialogue.  Could I get a summary of this discussion?  Perhaps I'll wade in.

Jim Golden
goldenjb@ctrvax.vanderbilt.edu
