Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!news.duq.edu!newsgate.duke.edu!agate!howland.erols.net!cam-news-hub1.bbnplanet.com!nntp-hub2.barrnet.net!nntp-hub3.barrnet.net!voder!nsc!taux01!usenet
From: Marcelo Krygier <marcelo@taux01.nsc.com>
Subject: Overfit concept does not fit
X-Nntp-Posting-Host: tasu24
Content-Type: text/plain; charset=us-ascii
Message-ID: <322AE403.2781E494@taux01.nsc.com>
Sender: usenet@taux01.nsc.com (Usenet news account)
Content-Transfer-Encoding: 7bit
Organization: National Semiconductor I.C.
Mime-Version: 1.0
Date: Mon, 2 Sep 1996 13:41:23 GMT
X-Mailer: Mozilla 3.0 (X11; I; SunOS 4.1.3 sun4m)
Lines: 13

We all know NNs can be overtrained/overfitted.
This should point to some basic problem with the ANN models
we use. Biological neurons, being the model everyone tries to
simulate in ANNs, work BETTER when shown more examples, yet ANNs
get their weights screwed up when overtrained.
Can anybody explain to me how this fits the ANN model ?
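For anyone who wants to see the effect concretely, here is a minimal
sketch (my own toy example, not anything from the poster's setup): fit
polynomials of increasing degree to a few noisy samples of a sine wave.
Excess capacity plays the role of overtraining here: the high-degree fit
drives training error toward zero while error on held-out points grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small noisy training set, plus a clean test set from the same curve.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0.05, 0.95, 50)
y_test = np.sin(2 * np.pi * x_test)

def errors(degree):
    # Least-squares polynomial fit; returns (train MSE, test MSE).
    coefs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for d in (1, 3, 9):
    tr, te = errors(d)
    print(f"degree {d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

The training error shrinks monotonically as the degree rises, but the
degree-9 fit (which can interpolate all 10 points) does worse on the
test set than the degree-3 fit -- the same failure mode as an
overtrained net memorizing its training patterns.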


-- 
---------------------------------------------------------------------
 Marcelo Krygier                      @email : marcelo@taux01.nsc.com
                                      Tel    : (972) 9 594210
---------------------------------------------------------------------
