Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!gatech2!news.mathworks.com!uunet!in2.uu.net!halon!sybase!usenet
From: George Van Treeck <treeck@sybase.com>
Subject: Re: Guidelines to no. of neurons
Content-Type: text/plain; charset=us-ascii
Message-ID: <30C5E9A1.496B@sybase.com>
Sender: usenet@sybase.com
Content-Transfer-Encoding: 7bit
Organization: Sybase, Inc.
References: <49h9mm$t3j@nuscc.nus.sg> <49upf6$kdt@fstgal00.tu-graz.ac.at> <4a0drv$cd8@ixnews3.ix.netcom.com>
Mime-Version: 1.0
Date: Wed, 6 Dec 1995 19:06:09 GMT
X-Mailer: Mozilla 2.0b2 (X11; I; SunOS 5.3 sun4m)
Lines: 33

Jive Dadson wrote:
> 
> I too have been frustrated by the lack of a reasonable answer to this
> question. How many neurons is just right?
> 
> With other statistics-based estimators, you can at least
> fall back on a least-squares cross-validation scheme, but for even
> moderate-sized neural nets that would take trans-geological amounts of
> time. The fact that the sizes of all the layers must be computed blows it
> up even more. On the face of it, the problem looks intractable.
> 
> Until the question can be answered with some rigor, I'm a little
> bit leery of neural net technology. I am worried by the fact that with
> Parzen kernel estimators, the window size is absolutely critical, and
> depends not only on the size of the training set, but also on the
> first and second derivatives of the function being estimated. The
> number of neurons in a net would seem to be an analogous "smoothing
> factor". How do we know that the neuron count is not just as critical
> to a neural net as the window size is to a kernel based estimator?
> 
>        Jive

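Jive's point about the window width is easy to demonstrate numerically.
The sketch below (my own illustrative example, in Python/NumPy; the data,
grid, and widths are arbitrary) estimates a standard normal density with
Parzen windows and shows how the error grows when the width h is far from
a good value:

```python
import numpy as np

def parzen_kde(x, samples, h):
    """Parzen-window density estimate with a Gaussian kernel of width h."""
    u = (x - samples[:, None]) / h
    return np.mean(np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi)), axis=0)

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=200)   # draws from a standard normal
grid = np.linspace(-4, 4, 81)
true_density = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)

# Mean squared error of the estimate for several window widths:
for h in (0.01, 0.3, 3.0):
    mse = np.mean((parzen_kde(grid, samples, h) - true_density) ** 2)
    print(f"h = {h}: MSE = {mse:.5f}")
```

Too small an h fits the individual samples as spikes; too large an h
smears the density flat. Only the middle width comes close.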
For multilayer NNs, the number of neurons in the inner (hidden)
layer should equal the number of salient classes of features in the
input patterns.  If you don't know the number of feature classes,
then I recommend using a self-organizing net for the inner layer.
It will determine the number of feature classes for you, and
its learning time is less sensitive to the number of neurons
in the layer.  You can then use backprop for the output layer.
The number of neurons in the output layer depends on how many
unique outputs you want.
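A minimal NumPy sketch of that recipe, on my own toy data (three
well-separated clusters): the inner layer is plain winner-take-all
competitive learning rather than a full Kohonen map, and the output
layer is trained with the delta rule, i.e. backprop on a single layer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy input patterns: three well-separated 2-D clusters, so there are
# three "salient feature classes" for the inner layer to discover.
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
labels = rng.integers(0, 3, size=300)
X = centers[labels] + 0.3 * rng.standard_normal((300, 2))
Y = np.eye(3)[labels]            # one unique output per class (one-hot)

# Inner layer: competitive (self-organizing) learning. Start with more
# units than needed; units the data doesn't claim simply stop winning.
n_units = 10
W = 0.1 * rng.standard_normal((n_units, 2))
for epoch in range(20):
    lr = 0.5 / (1 + epoch)
    for x in X:
        win = int(np.argmin(np.sum((W - x) ** 2, axis=1)))
        W[win] += lr * (x - W[win])          # move the winner toward x

# Hidden code: one-hot vector marking the winning unit for each pattern.
winners = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(-1), axis=1)
H = (winners[:, None] == np.arange(n_units)).astype(float)
print("inner-layer units actually in use:", int((H.sum(0) > 0).sum()))

# Output layer: delta-rule (single-layer backprop) training of a linear
# map from the hidden code to the one-hot targets.
V = np.zeros((n_units, 3))
for _ in range(200):
    V += (0.5 / len(X)) * H.T @ (Y - H @ V)

accuracy = np.mean(np.argmax(H @ V, axis=1) == labels)
print(f"training accuracy: {accuracy:.2f}")
```

Since the competitive layer only moves the winning unit, the count of
units still winning after training is the net's own estimate of the
number of feature classes; the output layer then only has to relabel
those winners.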

-George
