Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.cis.ohio-state.edu!news.maxwell.syr.edu!cam-news-hub1.bbnplanet.com!news.bbnplanet.com!howland.erols.net!worldnet.att.net!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: changes to "comp.ai.neural-nets FAQ" -- monthly posting
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn.changes.posting_857188841@hotellng.unx.sas.com>
Supersedes: <nn.changes.posting_854510445@hotellng.unx.sas.com>
Date: Sat, 1 Mar 1997 04:00:42 GMT
Expires: Sat, 5 Apr 1997 04:00:41 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Keywords: modifications, new, additions, deletions
Followup-To: comp.ai.neural-nets
Lines: 537

==> nn1.changes.body <==
*** nn1.oldbody	Tue Jan 28 23:00:19 1997
--- nn1.body	Fri Feb 28 23:00:11 1997
***************
*** 644,647 ****
  ------------------------------------------------------------------------
  
! Next part is part 2 (of 7). @
  
--- 644,647 ----
  ------------------------------------------------------------------------
  
! Next part is part 2 (of 7). 
  

==> nn2.changes.body <==
*** nn2.oldbody	Tue Jan 28 23:00:24 1997
--- nn2.body	Fri Feb 28 23:00:17 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1997-01-24
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1997-02-28
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 23,26 ****
--- 23,30 ----
     What is the curse of dimensionality?
     How do MLPs compare with RBFs?
+       Hybrid training and the curse of dimensionality
+       Additive inputs
+       Redundant inputs
+       Irrelevant inputs
     What are OLS and subset regression?
     Should I normalize/standardize/rescale the data?
***************
*** 749,753 ****
  While the distinction between these two types of Gaussian RBF architectures
  is sometimes mentioned in the NN literature, its importance has rarely been
! appreciated except by Tao (1993) and Werntges (1993). 
  
  There are several subtypes of both ORBF and NRBF architectures. Descriptions
--- 753,759 ----
  While the distinction between these two types of Gaussian RBF architectures
  is sometimes mentioned in the NN literature, its importance has rarely been
! appreciated except by Tao (1993) and Werntges (1993). Shorten and
! Murray-Smith (1996) also compare ordinary and normalized Gaussian RBF
! networks. 
  
  There are several subtypes of both ORBF and NRBF architectures. Descriptions
***************
*** 840,845 ****
  hidden-to-output weights are within a reasonable range (such as the range of
  the target values), you can be sure that the outputs will be within that
! same range for all possible inputs, even when the net is extrapolating. 
  
  Radial combination functions incorporating altitudes are useful with NRBF
  architectures. The NRBF architectures combine some of the virtues of both
--- 846,860 ----
  hidden-to-output weights are within a reasonable range (such as the range of
  the target values), you can be sure that the outputs will be within that
! same range for all possible inputs, even when the net is extrapolating. No
! comparably useful bound exists for the output of an ORBF network. 
  
+ If you extrapolate far enough in a Gaussian ORBF network with an identity
+ output activation function, the activation of every hidden unit will
+ approach zero, hence the extrapolated output of the network will equal the
+ output bias. If you extrapolate far enough in an NRBF network, one hidden
+ unit will come to dominate the output. Hence if you want the network to
! extrapolate different values in different directions, an NRBF should be
+ used instead of an ORBF. 
+ 
  Radial combination functions incorporating altitudes are useful with NRBF
  architectures. The NRBF architectures combine some of the virtues of both
***************
*** 860,864 ****
  piecewise linear for RBF units with equal widths, and approximately
  piecewise spherical for RBF units with unequal widths. The larger the
! widths, the smoother the isoactivation contours where the pices join. 
  
  The NRBFEQ architecture is a smoothed variant of the learning vector
--- 875,882 ----
  piecewise linear for RBF units with equal widths, and approximately
  piecewise spherical for RBF units with unequal widths. The larger the
! widths, the smoother the isoactivation contours where the pieces join. As
! Shorten and Murray-Smith (1996) point out, the activation is not necessarily
! a monotone function of distance from the center when unequal widths are
! used. 
  
  The NRBFEQ architecture is a smoothed variant of the learning vector
***************
*** 1025,1034 ****
  hidden units required grows exponentially with the number of inputs,
  regardless of how many inputs are relevant. This exponential growth is
! related to the fact that RBFs and ERBFs have local receptive fields, meaning
! that changing the hidden-to-output weights of a given unit will affect the
! output of the network only in a neighborhood of the center of the hidden
! unit, where the size of the neighborhood is determined by the width of the
! hidden unit. (Of course, if the width of the unit is learned, the receptive
! field could grow to cover the entire training set.) 
  
  Local receptive fields are often an advantage compared to the distributed
--- 1043,1052 ----
  hidden units required grows exponentially with the number of inputs,
  regardless of how many inputs are relevant. This exponential growth is
! related to the fact that ORBFs have local receptive fields, meaning that
! changing the hidden-to-output weights of a given unit will affect the output
! of the network only in a neighborhood of the center of the hidden unit,
! where the size of the neighborhood is determined by the width of the hidden
! unit. (Of course, if the width of the unit is learned, the receptive field
! could grow to cover the entire training set.) 
  
  Local receptive fields are often an advantage compared to the distributed
***************
*** 1096,1101 ****
  References:
  There are few good references on RBF networks. Bishop (1995) gives one of
! the better surveys, but also see Tao (1993) for the importance of
! normalization. 
  
     Bishop, C.M. (1995), Neural Networks for Pattern Recognition, Oxford:
--- 1114,1119 ----
  References:
  There are few good references on RBF networks. Bishop (1995) gives one of
! the better surveys, but also see Tao (1993) and Werntges (1993) for the
! importance of normalization. 
  
     Bishop, C.M. (1995), Neural Networks for Pattern Recognition, Oxford:
***************
*** 1140,1143 ****
--- 1158,1165 ----
     1551-1573, ftp://ftp.sas.com/pub/neural/neural2.ps. 
  
+    Shorten, R., and Murray-Smith, R. (1996), "Side effects of normalising
+    radial basis function networks," International Journal of Neural Systems,
+    7, 167-179. 
+ 
     Tao, K.M. (1993), "A closer look at the radial basis function (RBF)
     networks," Conference Record of The Twenty-Seventh Asilomar
***************
*** 2100,2103 ****
--- 2122,2129 ----
  Please see the comp.ai.genetic FAQ for further information. 
  
+ Andrew Gray's Hybrid Systems FAQ at the University of Otago at 
+ http://divcom.otago.ac.nz:800/COM/INFOSCI/SMRL/people/andrew/publications/faq/hybrid/hybrid.htm
+ also has links to information on neuro-genetic methods. 
+ 
  ------------------------------------------------------------------------
  
***************
*** 2164,2167 ****
--- 2190,2195 ----
   o Marcello Chiaberge's Neuro-Fuzzy page at 
     http://polimage.polito.it/~marcello. 
+  o Andrew Gray's Hybrid Systems FAQ at the University of Otago at 
+    http://divcom.otago.ac.nz:800/COM/INFOSCI/SMRL/people/andrew/publications/faq/hybrid/hybrid.htm
  
  References: 

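The ORBF/NRBF extrapolation behaviour described in the part-2 changes above (the ORBF output approaching the output bias, the NRBF output approaching the weight of the dominant hidden unit) can be sketched numerically. The centers, widths, weights, and bias below are illustrative values chosen for this sketch, not taken from the FAQ:

```python
import numpy as np

# A 1-D Gaussian RBF net with two hidden units (illustrative parameters).
centers = np.array([-1.0, 1.0])
widths = np.array([0.5, 0.5])
weights = np.array([2.0, -3.0])   # hidden-to-output weights
bias = 0.7                        # output bias (ORBF only)

def hidden(x):
    """Gaussian activations of the hidden units at scalar input x."""
    return np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))

def orbf(x):
    """Ordinary RBF: weighted sum of activations plus output bias."""
    return weights @ hidden(x) + bias

def nrbf(x):
    """Normalized RBF: activations normalized to sum to one (no bias)."""
    h = hidden(x)
    return weights @ (h / h.sum())

# Far from both centers, every ORBF activation approaches zero, so the
# ORBF output approaches the bias; in the NRBF the unit centered at +1
# dominates the normalized sum, so the output approaches its weight.
print(orbf(5.0))   # ≈ 0.7 (the output bias)
print(nrbf(5.0))   # ≈ -3.0 (weight of the dominant unit)
```

Extrapolating in the opposite direction (e.g. x = -5) makes the unit at -1 dominate instead, so the NRBF extrapolates a different value there, while the ORBF still returns the bias.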
==> nn3.changes.body <==
*** nn3.oldbody	Tue Jan 28 23:00:29 1997
--- nn3.body	Fri Feb 28 23:00:22 1997
***************
*** 96,100 ****
  can also use such information to choose the training cases more efficiently.
  For example, with a linear model, you should choose training cases at the
! outer limits of the input space instead of evenly distributing them out
  throughout the input space. 
  
--- 96,100 ----
  can also use such information to choose the training cases more efficiently.
  For example, with a linear model, you should choose training cases at the
! outer limits of the input space instead of evenly distributing them
  throughout the input space. 
  

==> nn4.changes.body <==
*** nn4.oldbody	Tue Jan 28 23:00:33 1997
--- nn4.body	Fri Feb 28 23:00:26 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part4
! Last-modified: 1996-12-30
  URL: ftp://ftp.sas.com/pub/neural/FAQ4.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part4
! Last-modified: 1997-02-28
  URL: ftp://ftp.sas.com/pub/neural/FAQ4.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 22,25 ****
--- 22,36 ----
     Other sources of information about NNs?
     Databases for experimentation with NNs?
+       The neural-bench Benchmark collection
+       Proben1
+       UCI machine learning database
+       NIST special databases of the National Institute Of Standards And
+       Technology:
+       CEDAR CD-ROM 1: Database of Handwritten Cities, States, ZIP Codes,
+       Digits, and Alphabetic Characters
+       AI-CD-ROM
+       Time series archive
+       USENIX Faces
+       Lloyd Lubet's Financial, Business, and Economic Data Warehouse
  
  Part 5: Free software
***************
*** 652,655 ****
--- 663,688 ----
           (Note: Remarks are from journal announcement)
  
+ Title:   IEEE Transactions on Evolutionary Computation
+ Publish: Institute of Electrical and Electronics Engineers (IEEE)
+ Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ,
+          08855-1331 USA. Tel: (201) 981-0060
+ Cost/Yr: $10 for Members belonging to participating IEEE societies
+ Freq.:   Quarterly (vol. 1 in May 1997)
+ URL:     http://engine.ieee.org/nnc/pubs/transactions.html
+ Remark:  The IEEE Transactions on Evolutionary Computation will publish archival
+          journal quality original papers in evolutionary computation and related
+          areas, with particular emphasis on the practical application of the
+          techniques to solving real problems in industry, medicine, and other
+          disciplines.  Specific techniques include but are not limited to
+          evolution strategies, evolutionary programming, genetic algorithms, and
+          associated methods of genetic programming and classifier systems.  Papers
+          emphasizing mathematical results should ideally seek to put these results
+          in the context of algorithm design, however purely theoretical papers will
+          be considered.  Other papers in the areas of cultural algorithms, artificial
+          life, molecular computing, evolvable hardware, and the use of simulated
+          evolution to gain a better understanding of naturally evolved systems are
+          also encouraged.
+          (Note: Remarks are from journal CFP)
+ 
  Title:   International Journal of Neural Systems
  Publish: World Scientific Publishing
***************
*** 1111,1116 ****
     (in digest form) dealing with all aspects of neural networks (and any
     type of network or neuromorphic system)" To subscribe, send email to
!    neuron-request@cattell.psych.upenn.edu comp.ai.neural-net readers also
!    find the messages in that newsgroup in the form of digests. 
  
  3. Usenet groups comp.ai.neural-nets (Oha!) and
--- 1144,1151 ----
     (in digest form) dealing with all aspects of neural networks (and any
     type of network or neuromorphic system)" To subscribe, send email to
!    neuron-request@psych.upenn.edu. The ftp archives (including back issues)
!    are available from psych.upenn.edu in pub/Neuron-Digest or by sending
!    email to "archive-server@psych.upenn.edu". comp.ai.neural-net readers
!    also find the messages in that newsgroup in the form of digests. 
  
  3. Usenet groups comp.ai.neural-nets (Oha!) and
***************
*** 1205,1218 ****
     In World-Wide-Web (WWW, for example via the xmosaic program) you can read
     neural network information for instance by opening one of the following
!    uniform resource locators (URLs): http://www.neuronet.ph.kcl.ac.uk
!    (NEuroNet, King's College, London), http://www.eeb.ele.tue.nl (Eindhoven,
!    Netherlands), http://www.emsl.pnl.gov:2080/docs/cie/neural/ (Richland,
!    Washington), http://www.cosy.sbg.ac.at/~rschwaig/rschwaig/projects.html
!    (Salzburg, Austria), http://http2.sils.umich.edu/Public/nirg/nirg1.html
!    (Michigan), http://www.lpac.ac.uk/SEL-HPC/Articles/NeuralArchive.html
!    (London), http://rtm.science.unitn.it/ Reactive Memory Search (Tabu
!    Search) page (Trento, Italy), http://www.wi.leidenuniv.nl/art/ (ART WWW
!    site, Leiden, Netherlands), http://nucleus.hut.fi/nnrc/ Helsinki
!    University of Technology. 
     Many others are available too; WWW is changing all the time. 
  
--- 1240,1257 ----
     In World-Wide-Web (WWW, for example via the xmosaic program) you can read
     neural network information for instance by opening one of the following
!    uniform resource locators (URLs): 
!    http://www.neuronet.ph.kcl.ac.uk (NEuroNet, King's College, London), 
!    http://www.eeb.ele.tue.nl (Eindhoven, Netherlands), 
!    http://www.emsl.pnl.gov:2080/docs/cie/neural/ (Richland, Washington), 
!    http://www.cosy.sbg.ac.at/~rschwaig/rschwaig/projects.html (Salzburg,
!    Austria), 
!    http://http2.sils.umich.edu/Public/nirg/nirg1.html (Michigan), 
!    http://www.lpac.ac.uk/SEL-HPC/Articles/NeuralArchive.html (London), 
!    http://rtm.science.unitn.it/ Reactive Memory Search (Tabu Search) page
!    (Trento, Italy), 
!    http://www.wi.leidenuniv.nl/art/ (ART WWW site, Leiden, Netherlands), 
!    http://nucleus.hut.fi/nnrc/ (Helsinki University of Technology), 
!    http://www.pitt.edu/~mattf/NeuroRing.html (links to neuroscience web pages). 
! 
     Many others are available too; WWW is changing all the time. 
  
***************
*** 1459,1462 ****
--- 1498,1521 ----
     same subject are required. However, it is still useful for use as testing
     set on a trained network. 
+ 
+ 9. Lloyd Lubet's Financial, Business, and Economic Data
+ +++++++++++++++++++++++++++++++++++++++++++++++++++++++
+    Warehouse
+    +++++++++
+ 
+    The database consists of 23 tables containing 896 Monthly Indicators,
+    Indexes, and Statistics for over 21 years. 7 tables deal directly with
+    historic stock prices and market indexes. You can download this
+    information either as a zip or ascii text file. For easy access, the
+    database includes a table of contents and highly descriptive column
+    headers. 
+ 
+    Also available: White Paper #1: Avoiding the Pitfalls of high-tech
+    statistics. This is a beginner's guide completely illustrated with some
+    nice visualizations. It is accompanied by an 8 page notebook. 
+ 
+    URL: http://www.rt66.com/~llubet (Copyright 1996 Accelerated Systems,
+    Inc.)
+    Email: llubet@rt66.com 
  
     ------------------------------------------------------------------------

==> nn5.changes.body <==

==> nn6.changes.body <==
*** nn6.oldbody	Tue Jan 28 23:00:41 1997
--- nn6.body	Fri Feb 28 23:00:34 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part6
! Last-modified: 1997-01-13
  URL: ftp://ftp.sas.com/pub/neural/FAQ6.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part6
! Last-modified: 1997-02-11
  URL: ftp://ftp.sas.com/pub/neural/FAQ6.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 61,65 ****
  12. Cortex-Pro 
  13. Partek 
! 14. NeuroSolutions v2.0 
  15. Qnet For Windows Version 2.0 
  16. NeuroLab, A Neural Network Library 
--- 61,65 ----
  12. Cortex-Pro 
  13. Partek 
! 14. NeuroSolutions v3.0 
  15. Qnet For Windows Version 2.0 
  16. NeuroLab, A Neural Network Library 
***************
*** 544,595 ****
     http://www.partek.com/ 
  
! 14. NeuroSolutions v2.0
  +++++++++++++++++++++++
  
!    NeuroSolutions is a graphical neural network simulation tool. It
!    supports trajectory learning with backpropagation through time.
!    Because of its object-oriented design, NeuroSolutions provides the
!    flexibility needed to construct a wide range of learning paradigms
!    and network topologies.  Its GUI and extensive probing ability
!    streamline the experimentation process by providing real-time
!    analysis of the network during learning.
! 
!    Construct any neural network belonging to the additive model,
!    including locally and globally recurrent systems.  Use a variety of
!    unsupervised learning procedures, such as Hebbian, Sanger's, Oja's,
!    Competitive and Kohonen.  Implement RBF, PCA, counterpropagation and
!    other hybrid network topologies by seamlessly integrating
!    supervised and unsupervised learning.
! 
!    During learning, animate changes of internal variables such as
!    activations, weights, sensitivities and gradients with a variety of
!    probes.  Examples are the oscilloscope, spectrum analyzer, 3D state
!    space, scatter, 3D surface, matrix and bitmap.
! 
!    NeuroSolutions'  NeuralWizard utility automates the neural network
!    design process.  Choose between a wide range of neural models. The
!    network parameters are dynamically adjusted based on the user's
!    training data.  It is a powerful tool used by both beginners and
!    researchers alike.
! 
!    NeuroSolutions offers advanced features to meet the integration
!    needs of neural network developers.  Once a system is designed and
!    simulated using the icon-based development environment,
!    NeuroSolutions will generate ANSI-compatible C++ source code to be
!    compiled and linked into custom applications.  NeuroSolutions can
!    also be customized through user-defined DLL's and OLE support.
! 
!    Price: $195 - $1995
! 
!    Demo copy available from company or by anonymous ftp:
!            ftp://oak.oakland.edu/SimTel/win3/neurlnet/ns2demo1.zip
!            ftp://oak.oakland.edu/SimTel/win3/neurlnet/ns2demo2.zip
  
             NeuroDimension, Inc.
!            720 S.W. 2nd Ave., Suite 458
!            Gainesville FL, 32601
     Phone:  (800) 634-3327 or
!            (904) 377-5144
!    FAX:    (904) 338-6779
     Email:  info@nd.com
     URL:    http://www.nd.com/
--- 544,613 ----
     http://www.partek.com/ 
  
! 14. NeuroSolutions v3.0
  +++++++++++++++++++++++
  
!    NeuroSolutions is a graphical neural network simulation tool. Because of
!    its object-oriented design, NeuroSolutions provides the flexibility
!    needed to construct a wide range of learning paradigms and network
!    topologies. Its GUI and extensive probing ability streamline the
!    experimentation process by providing real-time analysis of the network
!    during learning. 
! 
!    Download a free evaluation copy from 
!    -------------------------------------
!    http://www.nd.com/demo/demo.htm. 
! 
!    Topologies
!    ----------
! 
!     o Multilayer perceptrons (MLPs) 
!     o Generalized Feedforward networks 
!     o Modular networks 
!     o Jordan-Elman networks 
!     o Self-Organizing Feature Map (SOFM) networks 
!     o Radial Basis Function (RBF) networks 
!     o Time Delay Neural Networks (TDNN) 
!     o Time-Lag Recurrent Networks (TLRN) 
!     o User-defined network topologies 
! 
!    Learning Paradigms
!    ------------------
! 
!     o Backpropagation 
!     o Recurrent Backpropagation 
!     o Backpropagation through Time 
!     o Unsupervised Learning 
!        o Hebbian 
!        o Oja's 
!        o Sanger's 
!        o Competitive 
!        o Kohonen 
! 
!    Advanced Features
!    -----------------
! 
!     o ANSI C++ Source Code Generation 
!     o Customized Components through DLLs 
!     o Microsoft Excel Add-in -- NeuroSolutions for Excel 
!        o Visual Data Selection 
!        o Data Preprocessing and Analysis 
!        o Batch Training and Parameter Optimization 
!        o Sensitivity Analysis 
!        o Automated Report Generation 
!     o Comprehensive Macro Language 
!     o Fully accessible from any OLE-compliant application, such as: 
!        o Visual Basic 
!        o Microsoft Excel 
!        o Microsoft Access 
  
+    Price:  $195 - $1995 (educational discounts available)
+    O.S.:   Windows 95, Windows NT
+ 
             NeuroDimension, Inc.
!            1800 N. Main St., Suite D4
!            Gainesville FL, 32609
     Phone:  (800) 634-3327 or
!            (352) 377-5144
!    FAX:    (352) 377-9009
     Email:  info@nd.com
     URL:    http://www.nd.com/

==> nn7.changes.body <==
*** nn7.oldbody	Tue Jan 28 23:00:44 1997
--- nn7.body	Fri Feb 28 23:00:38 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part7
! Last-modified: 1997-01-18
  URL: ftp://ftp.sas.com/pub/neural/FAQ7.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part7
! Last-modified: 1997-02-03
  URL: ftp://ftp.sas.com/pub/neural/FAQ7.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 151,164 ****
  ++++++++++++++++++++
  
!       4730 Walnut St., Suite 101B
!       Boulder, CO 80301
!       Voice: (303) 442-3539   Fax: (303) 442-2854
!       Internet: techsupport@ndx.com
!       NDX sells a number neural network hardware products:
!       NDX Neural Accelerators: a line of i860-based accelerator cards for
!       the PC that give up to 45 million connections per second for use
!       with the DynaMind neural network software.
!       iNNTS: Intel's 80170NX (ETANN) Neural Network Training System. NDX's president
!       was one of the co-designers of this chip.
  
  9. IC Tech, Inc.
--- 151,163 ----
  ++++++++++++++++++++
  
!       P.O. Box 14
!       Marion, OH  43301-0014
!       Voice: (614) 387-5074  Fax: (614) 382-4533
!       Internet:  jwrogers@on-ramp.net
! 
!       InfoTech Software Engineering purchased the software and
!       trademarks from NeuroDynamX, Inc. and, using the NeuroDynamX tradename,
!       continues to publish the DynaMind, DynaMind Developer Pro and iDynaMind
!       software packages. 
  
  9. IC Tech, Inc.
***************
*** 455,458 ****
--- 454,458 ----
   o Thomas Lindblad <lindblad@kth.se> 
   o Clark Lindsey <lindsey@particle.kth.se> 
+  o Lloyd Lubet <llubet@rt66.com> 
   o William Mackeown <mackeown@compsci.bristol.ac.uk> 
   o Maria Dolores Soriano Lopez <maria@vaire.imib.rwth-aachen.de> 
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

