Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!udel!darwin.sura.net!news.Vanderbilt.Edu!NewsWatcher!user
From: goldenjb@ctrvax.vanderbilt.edu (jim golden)
Subject: Re: "Trading... - Oh... really?" Part II (last)
Message-ID: <goldenjb-281094120415@129.59.170.62>
Followup-To: comp.ai.neural-nets
Sender: news@news.vanderbilt.edu
Nntp-Posting-Host: 129.59.170.62
Organization: vanderbilt
References: <83591.arcon@dial.illinois.net> <goldenjb-261094142042@129.59.170.62> <1994Oct27.115504.26956@oxvaxd>
Date: Fri, 28 Oct 1994 17:01:53 GMT
Lines: 28

In article <1994Oct27.115504.26956@oxvaxd>, hsr4@vax.oxford.ac.uk (Blue
Peter) wrote:


> 
> Perhaps you might have added YMMV, because mine certainly does.  A year ago
> I produced a neural net sales predictor for a retail chain in the UK.  I had
> (and still have) no real knowledge of the chain's operation, and the parameters
> were provided by a third party using a standard assessment protocol, from which 


Ah, but you see you do have statistical knowledge which enabled you to
build a training set of some value.  And you got a net that works.  How
often does it work?  What architecture did you use?  Why?  How many hidden
layers?  Why?  What function were you approximating?  How fast does it run?
How well does the net generalize?  Prove it.  I understand the
non-disclosure part, the questions are rhetorical.  But if you have
knowledge of statistics, why did you use a NN at all?  Why not some easily
implementable statistical method?  My major gripe is that people who use
NNs without understanding how a network works or why a particular
architecture was chosen do not get an answer they can justify.  Neural
networks are highly combinatorial entities, and every parameter affects
performance, from momentum to number of layers to squashing function.  I
believe you should be able to justify every architectural parameter.  
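To make the "easily implementable statistical method" point concrete, here is a minimal sketch of the kind of baseline I mean: ordinary least squares fit by the normal equations, in a few lines of pure Python.  The data and function names are hypothetical, purely for illustration; the point is that a one-parameter-per-term model like this has nothing to justify, and any net should have to beat it.

```python
# Baseline sketch: fit y = a*x + b by ordinary least squares.
# Hypothetical data -- a roughly linear "sales" series with noise.

def ols_fit(xs, ys):
    """Closed-form least-squares fit of a line to (xs, ys)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def mse(xs, ys, a, b):
    """Mean squared error of the fitted line on (xs, ys)."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.1]
a, b = ols_fit(xs, ys)
print("slope=%.3f intercept=%.3f mse=%.4f" % (a, b, mse(xs, ys, a, b)))
```

Every quantity here is derivable and defensible, which is exactly the standard I am holding the network to.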

Jim

P.S.  What does YMMV mean?
