Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!news.mathworks.com!uunet!in2.uu.net!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: changes to "comp.ai.neural-nets FAQ" -- monthly posting
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn.changes.posting_830746833@hotellng.unx.sas.com>
Supersedes: <nn.changes.posting_828072038@hotellng.unx.sas.com>
Date: Mon, 29 Apr 1996 03:00:34 GMT
Expires: Mon, 3 Jun 1996 03:00:33 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Keywords: modifications, new, additions, deletions
Followup-To: comp.ai.neural-nets
Lines: 383

==> nn1.changes.body <==

==> nn2.changes.body <==
*** nn2.oldbody	Thu Mar 28 23:00:16 1996
--- nn2.body	Sun Apr 28 23:00:13 1996
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1996-03-28
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1996-04-05
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 367,370 ****
--- 367,380 ----
  normalize/standardize/rescale the data?"). 
  
+ If you are using weight decay, you want to make sure that shrinking the
+ weights toward zero biases ('bias' in the statistical sense) the net in a
+ sensible, usually smooth, way. If you use 1 of C-1 coding for an input,
+ weight decay biases the output for the C-1 categories towards the output for
+ the 1 omitted category, which is probably not what you want, although there
+ might be special cases where it would make sense. If you use 1 of C coding
+ for an input, weight decay biases the output for all C categories roughly
+ towards the mean output for all the categories, which is smoother and
+ usually a reasonable thing to do. 
+ 
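The asymmetry described in the paragraph above can be checked with a small numerical sketch. This is not part of the FAQ diff; the three category means 1, 2, 6, the single-case-per-class data, and the gradient-descent fit are invented purely for illustration, with decay applied to the weights but not the bias, as is conventional:

```python
# Sketch: how weight decay biases predictions under two input codings.
# Toy data: one categorical input with classes A, B, C and made-up
# target means 1.0, 2.0, 6.0 (one case per class).

def fit(X, y, lam, lr=0.05, steps=5000):
    """Linear model b + w.x, loss = squared error + lam * ||w||^2.
    The bias b is not decayed."""
    d = len(X[0])
    b, w = 0.0, [0.0] * d
    for _ in range(steps):
        gb, gw = 0.0, [2.0 * lam * wj for wj in w]   # decay gradient
        for xi, yi in zip(X, y):
            r = b + sum(wj * xj for wj, xj in zip(w, xi)) - yi
            gb += 2.0 * r
            for j in range(d):
                gw[j] += 2.0 * r * xi[j]
        b -= lr * gb
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
    return b, w

y = [1.0, 2.0, 6.0]                      # targets for A, B, C

# 1-of-(C-1) coding: C is the omitted category (all zeros).
Xd = [[1, 0], [0, 1], [0, 0]]
b, w = fit(Xd, y, lam=1.0)
pred_dummy = [b + sum(wj * xj for wj, xj in zip(w, xi)) for xi in Xd]

# 1-of-C coding: every category gets its own dummy variable.
Xc = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
b, w = fit(Xc, y, lam=1.0)
pred_onehot = [b + sum(wj * xj for wj, xj in zip(w, xi)) for xi in Xc]

# With decay, 1-of-(C-1) pulls A's and B's predictions toward C's
# (which equals the undecayed bias), while 1-of-C shrinks all three
# roughly symmetrically toward the grand mean 3.
```

With lam=1.0 the 1-of-(C-1) fit gives predictions near (2.375, 2.875, 3.75), i.e. A and B dragged toward C, while the 1-of-C fit gives (2.0, 2.5, 4.5), a symmetric shrink toward the mean.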
  Now consider ordered categories. For inputs, some people recommend a
  "thermometer code" like this: 
***************
*** 607,617 ****
  
  Jitter is artificial noise deliberately added to the inputs during training.
! Training with jitter is closely related to regularization methods such as
! weight decay and ridge regression. It is also a form of smoothing related to
! kernel regression (see "What is GRNN?") 
  
  Training with jitter works because the functions that we want NNs to learn
  are mostly smooth. NNs can learn functions with discontinuities, but the
! discontinuities must be restricted to sets of measure zero if our network is
  restricted to a finite number of hidden units. 
  
--- 617,627 ----
  
  Jitter is artificial noise deliberately added to the inputs during training.
! Training with jitter is a form of smoothing related to kernel regression
! (see "What is GRNN?"). It is also closely related to regularization methods
! such as weight decay and ridge regression (see "What is weight decay?"). 
  
  Training with jitter works because the functions that we want NNs to learn
  are mostly smooth. NNs can learn functions with discontinuities, but the
! functions must be continuous in a finite number of regions if our network is
  restricted to a finite number of hidden units. 
  
***************
*** 626,629 ****
--- 636,654 ----
  effect (Koistinen and Holmstro\"m 1992). 
  
+ Consider any point in the input space, not necessarily one of the original
+ training cases. That point could possibly arise as a jittered input as a
+ result of jittering any of several of the original neighboring training
+ cases. The average target value at the given input point will be a weighted
+ average of the target values of the original training cases. For an infinite
+ number of jittered cases, the weights will be proportional to the
+ probability densities of the jitter distribution, located at the original
+ training cases and evaluated at the given input point. Thus the average
+ target values given an infinite number of jittered cases will, by
+ definition, be the Nadaraya-Watson kernel regression estimator using the
+ jitter density as the kernel. Hence, training with jitter is an
+ approximation to training on the kernel regression estimator. And choosing
+ the amount (variance) of jitter is equivalent to choosing the bandwidth of
+ the kernel regression estimator (Scott 1992). 
+ 
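The limiting argument in the paragraph above can be illustrated with a Monte-Carlo sketch. This is not part of the FAQ diff; the four training cases, the Gaussian jitter, and the query point are all made up. Averaging the targets of jittered cases whose inputs land near a query point approximates the Nadaraya-Watson estimate with the jitter density as kernel:

```python
import math
import random

random.seed(0)

# Made-up training cases (x_i, y_i) and Gaussian jitter with sd sigma.
xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 4.0, 9.0]
sigma, x0 = 0.5, 1.5                     # jitter sd and a query point

# Nadaraya-Watson estimate at x0, using the jitter density as kernel.
k = [math.exp(-(x0 - xi) ** 2 / (2 * sigma ** 2)) for xi in xs]
nw = sum(ki * yi for ki, yi in zip(k, ys)) / sum(k)

# Monte Carlo: jitter the training inputs many times and average the
# targets of the jittered cases falling in a narrow window around x0.
total, count = 0.0, 0
for _ in range(200_000):
    i = random.randrange(len(xs))        # pick a training case
    if abs(xs[i] + random.gauss(0.0, sigma) - x0) < 0.05:
        total += ys[i]
        count += 1
mc = total / count

# mc approximates nw; choosing sigma amounts to choosing the
# bandwidth of the kernel regression estimator.
```

The two estimates agree to within Monte-Carlo error, which is the sense in which training on jittered data approximates training on the kernel regression estimator.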
  When studying nonlinear models such as feedforward NNs, it is often helpful
  first to consider what happens in linear models, and then to see what
***************
*** 760,763 ****
--- 785,790 ----
     backpropagation training with noise," NIPS4, 1033-1039. 
  
+    Scott, D.W. (1992) Multivariate Density Estimation, Wiley. 
+ 
     Vinod, H.D. and Ullah, A. (1981) Recent Advances in Regression Methods,
     NY: Marcel-Dekker. 
***************
*** 809,813 ****
     train to convergence, then go back and see which iteration had the lowest
     validation error. For more elaborate algorithms, see section 3.3 of 
!    /pub/papers/techreports/1994/1994-21.ps.gz. 
  
  Statisticians tend to be skeptical of stopped training because it appears to
--- 836,840 ----
     train to convergence, then go back and see which iteration had the lowest
     validation error. For more elaborate algorithms, see section 3.3 of 
!    ftp://ftp.ira.uka.de/pub/papers/techreports/1994/1994-21.ps.gz. 
  
  Statisticians tend to be skeptical of stopped training because it appears to
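The recipe quoted above (train to convergence while recording validation error, then go back to the iteration with the lowest validation error) can be sketched in a few lines. This is not part of the FAQ diff; the degree-5 polynomial model and the noisy toy data are invented for illustration:

```python
import random

random.seed(1)

def features(x, degree=5):
    return [x ** j for j in range(degree + 1)]

def predict(w, x):
    return sum(wj * pj for wj, pj in zip(w, features(x)))

def mse(w, data):
    return sum((predict(w, x) - y) ** 2 for x, y in data) / len(data)

# Made-up data: y = x plus noise; a degree-5 model can overfit it.
def sample(n):
    return [(x, x + random.gauss(0.0, 0.2))
            for x in (random.random() for _ in range(n))]

train, val = sample(8), sample(8)

w = [0.0] * 6
best_w, best_val, best_epoch = list(w), float("inf"), -1
for epoch in range(2000):                  # train to convergence
    grad = [0.0] * 6
    for x, y in train:                     # batch gradient of the MSE
        phi = features(x)
        r = predict(w, x) - y
        for j in range(6):
            grad[j] += 2.0 * r * phi[j] / len(train)
    w = [wj - 0.05 * g for wj, g in zip(w, grad)]
    v = mse(w, val)                        # record validation error
    if v < best_val:                       # remember the best iteration
        best_val, best_w, best_epoch = v, list(w), epoch

final_val = mse(w, val)
# best_w (the lowest-validation-error weights), not the fully trained
# w, is what stopped training returns.
```

By construction the restored weights never have higher validation error than the fully converged ones; the more elaborate algorithms referenced above refine when to stop rather than this bookkeeping.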

==> nn3.changes.body <==

==> nn4.changes.body <==

==> nn5.changes.body <==
*** nn5.oldbody	Thu Mar 28 23:00:29 1996
--- nn5.body	Sun Apr 28 23:00:23 1996
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part5
! Last-modified: 1996-03-06
  URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part5
! Last-modified: 1996-04-27
  URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 52,68 ****
  17. Multi-Module Neural Computing Environment (MUME) 
  18. LVQ_PAK, SOM_PAK 
! 19. SESAME 
! 20. Nevada Backpropagation (NevProp) 
! 21. Fuzzy ARTmap 
! 22. PYGMALION 
! 23. Basis-of-AI-backprop 
! 24. Matrix Backpropagation 
! 25. WinNN 
! 26. BIOSIM 
! 27. The Brain 
! 28. FuNeGen 
! 29. NeuDL -- Neural-Network Description Language 
! 30. NeoC Explorer 
! 31. AINET 
  
  Here are the full descriptions and references: 
--- 52,67 ----
  17. Multi-Module Neural Computing Environment (MUME) 
  18. LVQ_PAK, SOM_PAK 
! 19. Nevada Backpropagation (NevProp) 
! 20. Fuzzy ARTmap 
! 21. PYGMALION 
! 22. Basis-of-AI-backprop 
! 23. Matrix Backpropagation 
! 24. WinNN 
! 25. BIOSIM 
! 26. The Brain 
! 27. FuNeGen 
! 28. NeuDL -- Neural-Network Description Language 
! 29. NeoC Explorer 
! 30. AINET 
  
  Here are the full descriptions and references: 
***************
*** 309,340 ****
     programs to be used with SOM_PAK and LVQ_PAK can be found in /pub/utils).
  
! 19. SESAME
! ++++++++++
! 
!    ("Software Environment for the Simulation of Adaptive Modular Systems")
!    SESAME is a prototypical software implementation which facilitates 
!     o Object-oriented building blocks approach. 
!     o Contains a large set of C++ classes useful for neural nets,
!       neurocontrol and pattern recognition. No C++ classes can be used as
!       stand alone, though! 
!     o C++ classes include CartPole, nondynamic two-robot arms, Lunar Lander,
!       Backpropagation, Feature Maps, Radial Basis Functions, TimeWindows,
!       Fuzzy Set Coding, Potential Fields, Pandemonium, and diverse utility
!       building blocks. 
!     o A kernel which is the framework for the C++ classes and allows
!       run-time manipulation, construction, and integration of arbitrary
!       complex and hybrid experiments. 
!     o Currently no graphic interface for construction, only for
!       visualization. 
!     o Platform is SUN4, XWindows 
!    Unfortunately no reasonable good introduction has been written until now.
!    We hope to have something soon. For now we provide papers (eg. NIPS-92),
!    a reference manual (>220 pages), source code (ca. 35.000 lines of code),
!    and a SUN4-executable by ftp only. Sesame and its description is
!    available in various files for anonymous ftp on ftp ftp.gmd.de in the
!    directories /gmd/as/sesame and /gmd/as/paper. Questions to
!    sesame-request@gmd.de; there is only very limited support available. 
! 
! 20. Nevada Backpropagation (NevProp)
  ++++++++++++++++++++++++++++++++++++
  
--- 308,312 ----
     programs to be used with SOM_PAK and LVQ_PAK can be found in /pub/utils).
  
! 19. Nevada Backpropagation (NevProp)
  ++++++++++++++++++++++++++++++++++++
  
***************
*** 378,382 ****
     Research. 
  
! 21. Fuzzy ARTmap
  ++++++++++++++++
  
--- 350,354 ----
     Research. 
  
! 20. Fuzzy ARTmap
  ++++++++++++++++
  
***************
*** 385,389 ****
     (44 kB). 
  
! 22. PYGMALION
  +++++++++++++
  
--- 357,361 ----
     (44 kB). 
  
! 21. PYGMALION
  +++++++++++++
  
***************
*** 394,398 ****
     imag.imag.fr: archive/pygmalion/pygmalion.tar.Z). 
  
! 23. Basis-of-AI-backprop
  ++++++++++++++++++++++++
  
--- 366,370 ----
     imag.imag.fr: archive/pygmalion/pygmalion.tar.Z). 
  
! 22. Basis-of-AI-backprop
  ++++++++++++++++++++++++
  
***************
*** 417,421 ****
     Tveter; 5228 N. Nashville Ave.; Chicago, Illinois 60656; drt@mcs.com 
  
! 24. Matrix Backpropagation
  ++++++++++++++++++++++++++
  
--- 389,393 ----
     Tveter; 5228 N. Nashville Ave.; Chicago, Illinois 60656; drt@mcs.com 
  
! 23. Matrix Backpropagation
  ++++++++++++++++++++++++++
  
***************
*** 433,437 ****
     (anguita@dibe.unige.it). 
  
! 25. WinNN
  +++++++++
  
--- 405,409 ----
     (anguita@dibe.unige.it). 
  
! 24. WinNN
  +++++++++
  
***************
*** 449,453 ****
     ftp.cc.monash.edu.au as /pub/win3/programr/winnn97.zip (747 kB). 
  
! 26. BIOSIM
  ++++++++++
  
--- 421,425 ----
     ftp.cc.monash.edu.au as /pub/win3/programr/winnn97.zip (747 kB). 
  
! 25. BIOSIM
  ++++++++++
  
***************
*** 484,488 ****
     0621-60-21372; fax 0621-60-43735 
  
! 27. The Brain
  +++++++++++++
  
--- 456,460 ----
     0621-60-21372; fax 0621-60-43735 
  
! 26. The Brain
  +++++++++++++
  
***************
*** 507,511 ****
     perkovic@cleese.apana.org.au 
  
! 28. FuNeGen 1.0
  +++++++++++++++
  
--- 479,483 ----
     perkovic@cleese.apana.org.au 
  
! 27. FuNeGen 1.0
  +++++++++++++++
  
***************
*** 517,521 ****
     Saman K. Halgamuge 
  
! 29. NeuDL -- Neural-Network Description Language
  ++++++++++++++++++++++++++++++++++++++++++++++++
  
--- 489,493 ----
     Saman K. Halgamuge 
  
! 28. NeuDL -- Neural-Network Description Language
  ++++++++++++++++++++++++++++++++++++++++++++++++
  
***************
*** 539,543 ****
     (jrogers@buster.eng.ua.edu). 
  
! 30. NeoC Explorer (Pattern Maker included)
  ++++++++++++++++++++++++++++++++++++++++++
  
--- 511,515 ----
     (jrogers@buster.eng.ua.edu). 
  
! 29. NeoC Explorer (Pattern Maker included)
  ++++++++++++++++++++++++++++++++++++++++++
  
***************
*** 551,555 ****
     /SimTel/msdos/neurlnet/neocog10.zip (193 kB, DOS version) 
  
! 31. AINET
  +++++++++
  
--- 523,527 ----
     /SimTel/msdos/neurlnet/neocog10.zip (193 kB, DOS version) 
  
! 30. AINET
  +++++++++
  
***************
*** 572,576 ****
  If you are using a small computer (PC, Mac, etc.) you may want to have a
  look at the Central Neural System Electronic Bulletin Board (see question 
! "Other sources of information"). Modem: 409-737-5312; Sysop: Wesley R.
  Elsberry; 4160 Pirates' Beach, Galveston, TX, USA; welsberr@orca.tamu.edu.
  There are lots of small simulator packages, the CNS ANNSIM file set. There
--- 544,548 ----
  If you are using a small computer (PC, Mac, etc.) you may want to have a
  look at the Central Neural System Electronic Bulletin Board (see question 
! "Other sources of information"). Modem: 409-737-5222; Sysop: Wesley R.
  Elsberry; 4160 Pirates' Beach, Galveston, TX, USA; welsberr@orca.tamu.edu.
  There are lots of small simulator packages, the CNS ANNSIM file set. There
***************
*** 581,584 ****
  ------------------------------------------------------------------------
  
! Next part is part 6 (of 7). Previous part is part 4. @
  
--- 553,556 ----
  ------------------------------------------------------------------------
  
! Next part is part 6 (of 7). Previous part is part 4. 
  

==> nn6.changes.body <==

==> nn7.changes.body <==
*** nn7.oldbody	Thu Mar 28 23:00:37 1996
--- nn7.body	Sun Apr 28 23:00:31 1996
***************
*** 422,425 ****
--- 422,426 ----
   o Gamze Erten <ictech@mcimail.com> 
   o Ed Rosenfeld <IER@aol.com> 
+  o Javier Blasco-Alberto <jblasco@ideafix.cps.unizar.es> 
   o Jean-Denis Muller <jdmuller@vnet.ibm.com> 
   o Jeff Harpster <uu0979!jeff@uu9.psi.com> 
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
