Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.psc.edu!hudson.lm.com!godot.cc.duq.edu!news.duke.edu!news.mathworks.com!tank.news.pipex.net!pipex!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: uiwiley.c
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DEM0Lo.JAD@unx.sas.com>
Date: Fri, 8 Sep 1995 23:15:24 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <zhongweiz.50.30450A7F@gscit-1.fcit.monash.edu.au>
Organization: SAS Institute Inc.
Lines: 71


In article <zhongweiz.50.30450A7F@gscit-1.fcit.monash.edu.au>, zhongweiz@gscit-1.fcit.monash.edu.au (Zhongwei ZHANG) writes:
|> A book, Neural network and fuzzy logic applications in c/c++, written by
|> Stephen T. Welstead, is great. I really enjoy reading it.  In it, you can find
|> a lot of useful code.

Welstead's book sucks. Try this one instead:

   Masters, T. (1993), _Practical Neural Network Recipes in C++_,
   Academic Press.

Wiley's two NN C++ books are both atrocious:

   Blum, Adam (1992), _Neural Networks in C++_, Wiley.

   Welstead, Stephen T. (1994), _Neural Network and Fuzzy Logic
   Applications in C/C++_, Wiley.

Both Blum and Welstead contribute to the dangerous myth that any
idiot can use a neural net by dumping in whatever data are handy
and letting it train for a few days. They both have little or no
discussion of generalization, validation, and overfitting. Neither
provides any valid advice on choosing the number of hidden nodes.
If you have ever wondered where these stupid "rules of thumb" that
pop up frequently come from, here's a source for one of them:

   "A rule of thumb is for the size of this [hidden] layer to be
   somewhere between the input layer size ... and the output layer
   size ..."  Blum, p. 60.

Blum offers some profound advice on choosing inputs:

   "The next step is to pick as many input factors as possible that
   might be related to [the target]."

Blum also shows a deep understanding of statistics:

   "A statistical model is simply a more indirect way of learning
   correlations. With a neural net approach, we model the problem
   directly." p. 8.

(BTW, I had concluded that Blum's book was worthless long before I
found _that_ gem.)

Blum at least mentions some important issues, however simplistic his
advice may be. Welstead just ignores them. What Welstead gives you is
code--vast amounts of code. I have no idea how anyone could write _that_
much code for a simple feedforward NN.  Welstead's approach to
validation, in his chapter on financial forecasting, is to reserve
_two_ cases for the validation set!

There is another C++ NN book by Rao & Rao, but it has very little
material on feedforward NNs, which account for the majority of
practical applications.

My comments apply only to the _text_ of the above books. I have not
examined or attempted to compile the code.

--

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
