Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!newshost.marcam.com!uunet!EU.net!news.sprintlink.net!crash!mkppp.cts.com!user
From: Dean_Abbott@partech.com (dean abbott)
Subject: Re: No. of hidden neurons ...
Organization: pgsc
Date: Fri, 24 Feb 1995 17:48:46 GMT
Message-ID: <Dean_Abbott-2402950956090001@mkppp.cts.com>
References: <D44LrC.IM2@hkuxb.hku.hk> <1995Feb17.080704.26746@uxmail.ust.hk> <rrg.1125.000A3357@aber.ac.uk> <3ih1ad$q1p@maui.cs.ucla.edu> <3iisgo$oc1@usenet.INS.CWRU.Edu>
Sender: news@crash.cts.com (news subsystem)
Nntp-Posting-Host: mkppp.cts.com
Lines: 28

In article <3iisgo$oc1@usenet.INS.CWRU.Edu>, cc439@cleveland.Freenet.Edu
(Philip M. Kalina) wrote:

> In a previous article, edwin@maui.cs.ucla.edu (E. Robert Tisdale) says:
> 
> >rrg@aber.ac.uk (Roy Goodacre) writes:
> >
> >> It is important not to have too many nodes in the hidden layer because this
> >> may allow the neural network to learn by example only and not to generalize
> >> . . .                     - Roy Goodacre
> >
> >This is nonsense!  Nature doesn't restrict the number of hidden units
> >in natural neural networks and there is no reason why we should restrict
> >the number of hidden units in artificial neural networks  . . .  you
> >should have at least as many distinct, independent input/output pairs
> >in the training set as there are connection weights and biases in your
> >network.  Having 30 times as many is almost as good as having an infinite
> >number of such pairs.    . . .   Hope this clears things up, Bob Tisdale.
> 

Yes, but I'm sure nature doesn't learn via minimizing squared error,
taking every example in the database as completely true. :)
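For anyone wanting to apply Tisdale's rule of thumb quoted above (at least as
many distinct training pairs as weights and biases, with 30 times as many being
nearly as good as infinite), a quick sketch of the parameter count for a
one-hidden-layer fully connected net follows. The layer sizes here are made up
purely for illustration:

```python
def count_params(n_in, n_hidden, n_out):
    """Weights plus biases for a fully connected input-hidden-output net."""
    weights = n_in * n_hidden + n_hidden * n_out
    biases = n_hidden + n_out
    return weights + biases

# Hypothetical network: 10 inputs, 5 hidden units, 1 output.
n_params = count_params(10, 5, 1)
print(n_params)        # 61 weights and biases
print(n_params)        # minimum training pairs per the rule of thumb
print(30 * n_params)   # 1830 pairs: "almost as good as infinite"
```

Note the count grows roughly with the product of adjacent layer sizes, so
adding hidden units raises the data requirement quickly under this rule.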

-- 
PAR Government Systems Corp.     |
1010 Prospect St., Suite 200     |
La Jolla, CA 92037               |
