Newsgroups: comp.ai.neural-nets,comp.answers,news.answers
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.cis.ohio-state.edu!math.ohio-state.edu!cs.utexas.edu!swrinde!news-res.gsl.net!news.gsl.net!news.mathworks.com!zombie.ncsc.mil!newsgate.duke.edu!interpath!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: comp.ai.neural-nets FAQ, Part 5 of 7: Free software
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn5.posting_838609229@hotellng.unx.sas.com>
Supersedes: <nn5.posting_836017229@hotellng.unx.sas.com>
Approved: news-answers-request@MIT.EDU
Date: Mon, 29 Jul 1996 03:00:30 GMT
Expires: Mon, 2 Sep 1996 03:00:29 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Keywords: frequently asked questions, answers
Followup-To: comp.ai.neural-nets
Lines: 622
Xref: glinda.oz.cs.cmu.edu comp.ai.neural-nets:32758 comp.answers:20108 news.answers:77872

Archive-name: ai-faq/neural-nets/part5
Last-modified: 1996-06-27
URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
Maintainer: saswss@unx.sas.com (Warren S. Sarle)

This is part 5 (of 7) of a monthly posting to the Usenet newsgroup
comp.ai.neural-nets. See part 1 of this posting for full information
about what it covers.

========== Questions ========== 
********************************

Part 1: Introduction
Part 2: Learning
Part 3: Generalization
Part 4: Books, data, etc.
Part 5: Free software

   Freely available software packages for NN simulation?

Part 6: Commercial software
Part 7: Hardware

------------------------------------------------------------------------

Subject: Freely available software packages for NN simulation?
==============================================================

Note for future submissions: Please restrict descriptions to 60 lines.
Please send an HTML-formatted version if at all possible. 

The following simulators are described below: 

1. Rochester Connectionist Simulator 
2. UCLA-SFINX 
3. NeurDS 
4. PlaNet (formerly known as SunNet) 
5. GENESIS 
6. Mactivation 
7. Cascade Correlation Simulator 
8. Quickprop 
9. DartNet 
10. SNNS 
11. Aspirin/MIGRAINES 
12. Adaptive Logic Network Educational Kit 
13. NeuralShell 
14. PDP++ 
15. Uts (Xerion, the sequel) 
16. Neocognitron simulator 
17. Multi-Module Neural Computing Environment (MUME) 
18. LVQ_PAK, SOM_PAK 
19. Nevada Backpropagation (NevProp) 
20. Fuzzy ARTmap 
21. PYGMALION 
22. Basis-of-AI-NN Software 
23. Matrix Backpropagation 
24. WinNN 
25. BIOSIM 
26. The Brain 
27. FuNeGen 
28. NeuDL -- Neural-Network Description Language 
29. NeoC Explorer 
30. AINET 
31. DemoGNG 

See also http://www.emsl.pnl.gov:2080/docs/cie/neural/systems/shareware.html

1. Rochester Connectionist Simulator
++++++++++++++++++++++++++++++++++++

   A quite versatile simulator program for arbitrary types of neural nets.
   Comes with a backprop package and an X11/SunView interface. Available via
   anonymous FTP from cs.rochester.edu in directory pub/packages/simulator
   as the files README (8 KB) and rcs_v4.2.tar.Z (2.9 MB). 

2. UCLA-SFINX
+++++++++++++

   ftp retina.cs.ucla.edu [131.179.16.6]; Login name: sfinxftp; Password:
   joshua; directory: pub; files : README; sfinx_v2.0.tar.Z; Email info
   request : sfinx@retina.cs.ucla.edu 

3. NeurDS
+++++++++

   A simulator for DEC systems supporting VT100 terminals. Available for
   anonymous ftp from gatekeeper.dec.com [16.1.0.2] in directory pub/DEC as
   the file NeurDS031.tar.Z (111 Kb). 

4. PlaNet5.7 (formerly known as SunNet)
+++++++++++++++++++++++++++++++++++++++

   A popular connectionist simulator, created by Yoshiro Miyata (Chukyo
   Univ., Japan), with versions that run under X Windows and on
   non-graphics terminals. 60-page User's Guide in PostScript. Send any
   questions to miyata@sccs.chukyo-u.ac.jp. Available for anonymous ftp from
   ftp.ira.uka.de as /pub/neuron/PlaNet5.7.tar.Z (800 kb) or from
   boulder.colorado.edu [128.138.240.1] as 
   /pub/generic-sources/PlaNet5.7.tar.Z 

5. GENESIS
++++++++++

   GENESIS 2.0 (GEneral NEural SImulation System) is a general purpose
   simulation platform which was developed to support the simulation of
   neural systems ranging from complex models of single neurons to
   simulations of large networks made up of more abstract neuronal
   components. Most current GENESIS applications involve realistic
   simulations of biological neural systems. Although the software can also
   model more abstract networks, other simulators are more suitable for
   backpropagation and similar connectionist modeling. Runs on most Unix
   platforms. Graphical front end XODUS. Parallel version for networks of
   workstations, symmetric multiprocessors, and MPPs also available.
   Available by ftp from ftp://genesis.bbb.caltech.edu/pub/genesis. Further
   information via WWW at http://www.bbb.caltech.edu/GENESIS/. 

6. Mactivation
++++++++++++++

   A neural network simulator for the Apple Macintosh. Available for ftp
   from ftp.cs.colorado.edu [128.138.243.151] as 
   /pub/cs/misc/Mactivation-3.3.sea.hqx 

7. Cascade Correlation Simulator
++++++++++++++++++++++++++++++++

   A simulator for Scott Fahlman's Cascade Correlation algorithm. Available
   for ftp from ftp.cs.cmu.edu in directory
   /afs/cs/project/connect/code/supported as the file cascor-v1.2.shar (223
   KB) There is also a version of recurrent cascade correlation in the same
   directory in file rcc1.c (108 KB). 

8. Quickprop
++++++++++++

   A variation of the back-propagation algorithm developed by Scott Fahlman.
   A simulator is available in the same directory as the cascade correlation
   simulator above in file nevprop1.16.shar (137 KB)
   (There is also an obsolete simulator called quickprop1.c (21 KB) in the
   same directory, but it has been superseded by NevProp. See also the
   description of NevProp below.) 
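
   For orientation, the heart of Fahlman's quickprop rule can be sketched
   in a few lines of C. This is an illustrative reconstruction of the
   textbook update, not code from any simulator listed here; the function
   name and the fallback handling are ours. Each weight step is found by
   fitting a parabola through the current and previous error slopes, with
   a growth factor mu (Fahlman suggests 1.75) capping the step size.

```c
#include <math.h>

/* One quickprop weight update (sketch of Fahlman's rule).
   s       - current error derivative dE/dw
   prev_s  - derivative from the previous epoch
   prev_dw - weight step taken in the previous epoch
   mu      - maximum growth factor for a step (typically 1.75)
   eps     - plain gradient-descent rate for the fallback case */
double quickprop_step(double s, double prev_s, double prev_dw,
                      double mu, double eps)
{
    if (prev_dw == 0.0) {
        /* no previous step to extrapolate from: gradient descent */
        return -eps * s;
    }
    if ((s > 0.0) == (prev_s > 0.0) &&
        fabs(s) >= fabs(prev_s) * mu / (1.0 + mu)) {
        /* parabola would overshoot (or be degenerate): cap the step */
        return mu * prev_dw;
    }
    /* jump to the minimum of the parabola through the two slopes */
    return prev_dw * s / (prev_s - s);
}
```

   A plain gradient-descent step of size eps is used on the first epoch,
   when there is no previous step to extrapolate from.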

9. DartNet
++++++++++

   DartNet is a Macintosh-based backpropagation simulator, developed at
   Dartmouth by Jamshed Bharucha and Sean Nolan as a pedagogical tool. It
   makes use of the Mac's graphical interface, and provides a number of
   tools for building, editing, training, testing and examining networks.
   This program is available by anonymous ftp from ftp.dartmouth.edu as 
   /pub/mac/dartnet.sit.hqx (124 KB). 

10. SNNS 4.1
++++++++++++

   "Stuttgart Neural Network Simulator" from the University of Stuttgart,
   Germany. A luxurious simulator for many types of nets; with X11
   interface: Graphical 2D and 3D topology editor/visualizer, training
   visualisation, multiple pattern set handling etc. Currently supports
   backpropagation (vanilla, online, with momentum term and flat spot
   elimination, batch, time delay), counterpropagation, quickprop,
   backpercolation 1, generalized radial basis functions (RBF), RProp, ART1,
   ART2, ARTMAP, Cascade Correlation, Recurrent Cascade Correlation, Dynamic
   LVQ, Backpropagation through time (for recurrent networks), batch
   backpropagation through time (for recurrent networks), Quickpropagation
   through time (for recurrent networks), Hopfield networks, Jordan and
   Elman networks, autoassociative memory, self-organizing maps, time-delay
   networks (TDNN), RBF_DDA, simulated annealing, Monte Carlo, Pruned
   Cascade-Correlation, Optimal Brain Damage, Optimal Brain Surgeon,
   Skeletonization, and is user-extendable (user-defined activation
   functions, output functions, site functions, learning procedures). C code
   generator snns2c. Works on SunOS, Solaris, IRIX, Ultrix, OSF, AIX, HP/UX,
   NextStep, and Linux. Distributed kernel can spread one learning run over
   a workstation cluster. Available for ftp from
   ftp.informatik.uni-stuttgart.de [129.69.211.2] in directory /pub/SNNS as 
   SNNSv4.1.tar.gz (1.4 MB, Source code) and SNNSv4.1.Manual.ps.gz (1 MB,
   Documentation). There are also various other files in this directory
   (e.g. the source version of the manual, a Sun Sparc executable, older
   versions of the software, some papers, an implementation manual, and the
   software in several smaller parts). It may be best to first have a look
   at the file SNNSv4.1.Readme. This file contains a somewhat more elaborate
   short description of the simulator. More information can be found in the
   WWW under 
   http://www.informatik.uni-stuttgart.de/ipvr/bv/projekte/snns/snns.html 

11. Aspirin/MIGRAINES
+++++++++++++++++++++

   Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural
   network simulations by reading a network description (written in a
   language called "Aspirin") and generates a C simulation. An interface
   (called "MIGRAINES") is provided to export data from the neural network
   to visualization tools. The system has been ported to a large number of
   platforms. The goal of Aspirin is to provide a common extendible
   front-end language and parser for different network paradigms. The
   MIGRAINES interface is a terminal based interface that allows you to open
   Unix pipes to data in the neural network. Users can display the data
   using either public or commercial graphics/analysis tools. Example
   filters are included that convert data exported through MIGRAINES to
   formats readable by Gnuplot 3.0, Matlab, Mathematica, and xgobi. The
   software is available from two FTP sites: from CMU's simulator collection
   on pt.cs.cmu.edu [128.2.254.155] in 
   /afs/cs/project/connect/code/unsupported/am6.tar.Z and from UCLA's
   cognitive science machine ftp.cognet.ucla.edu [128.97.50.19] in 
   /pub/alexis/am6.tar.Z (2 MB). 

12. Adaptive Logic Network Educational Kit (for Windows) 
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++

   The type of neural net used in the Atree 3.0 Educational Kit (EK) package
   differs from the traditional one. Logic functions AND and OR form the
   units in all hidden layers but the first, which uses simple perceptrons.
   Though this net can't compute real-valued outputs, since its outputs
   are strictly boolean, it can easily and naturally represent real-valued
   functions by giving a 0 above the function's graph and a 1 otherwise.
   This unorthodox approach is extremely useful, since it allows the user to
   impose constraints on the functions to be learned (monotonicity, bounds
   on slopes, convexity,...). Very rapid computation of functions is done by
   an ALN decision tree at whose leaves are small expressions of minimum
   and maximum operations acting on linear functions. 

   Two simple languages describe ALNs and the steps of training an ALN.
   Execution software for ALN decision trees resulting from training is
   provided in C source form for experimenters. EK and a brief guide are
   obtained by anonymous ftp from ftp.cs.ualberta.ca in directory
   /pub/atree/atree3/. Get the files atree3ek.exe and atree3ek.brief.guide. 

   An extensive User's Guide with an introduction to basic ALN theory is
   available on WWW at http://www.cs.ualberta.ca/~arms/guide/ch0.htm . This
   Educational Kit software is the same as the commercial Atree 3.0 program
   except that it allows only two input variables and is licensed for
   educational uses only. A built-in 2D and 3D plotting capability is useful
   to help the user understand how ALNs work. 
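
   The min/max-of-linear-pieces evaluation described above can be sketched
   as follows. This is a hypothetical layout chosen for illustration, not
   Atree's actual data structures: each leaf is a linear function a*x + b,
   a group takes the minimum over its pieces, and the function value is
   the maximum over the groups.

```c
#include <stddef.h>

/* A linear piece a*x + b. */
typedef struct { double a, b; } LinPiece;

/* Minimum of the linear pieces in one group, evaluated at x. */
static double eval_min_group(const LinPiece *p, size_t n, double x)
{
    double m = p[0].a * x + p[0].b;
    for (size_t i = 1; i < n; i++) {
        double v = p[i].a * x + p[i].b;
        if (v < m) m = v;
    }
    return m;
}

/* Maximum over all min-groups: the value of the represented function. */
double eval_max_of_mins(const LinPiece *groups[], const size_t *sizes,
                        size_t ngroups, double x)
{
    double m = eval_min_group(groups[0], sizes[0], x);
    for (size_t i = 1; i < ngroups; i++) {
        double v = eval_min_group(groups[i], sizes[i], x);
        if (v > m) m = v;
    }
    return m;
}
```

   For example, |x| is the maximum of the two one-piece groups {x} and
   {-x}; a convex function needs only a single group of several pieces.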

13. NeuralShell
+++++++++++++++

   Formerly available from FTP site quanta.eng.ohio-state.edu [128.146.35.1]
   as /pub/NeuralShell/NeuralShell.tar. Currently (April 94) not available
   and undergoing a major reconstruction. Not to be confused with NeuroShell
   by Ward System Group (see below under commercial software). 

14. PDP++
+++++++++

   The PDP++ software is a new neural-network simulation system written in
   C++. It represents the next generation of the PDP software released with
   the McClelland and Rumelhart "Explorations in Parallel Distributed
   Processing Handbook", MIT Press, 1987. It is easy enough for novice
   users, but very powerful and flexible for research use.
   The current version is 1.0, our first non-beta release. It has been
   extensively tested and should be completely usable. Works on Unix with
   X-Windows.
   Features: Full GUI (InterViews), realtime network viewer, data viewer,
   extendable object-oriented design, CSS scripting language with
   source-level debugger, GUI macro recording. 
   Algorithms: Feedforward and several recurrent BP, Boltzmann machine,
   Hopfield, Mean-field, Interactive activation and competition, continuous
   stochastic networks. 
   The software can be obtained by anonymous ftp from 
   ftp://hydra.psy.cmu.edu/pub/pdp++/ and from 
   ftp://unix.hensa.ac.uk/mirrors/pdp++/.
   For more information, see our WWW page at 
   http://www.cs.cmu.edu/Web/Groups/CNBC/PDP++/PDP++.html.
   There is a 250 page (printed) manual and an HTML version available
   on-line at the above address. 

15. Uts (Xerion, the sequel)
++++++++++++++++++++++++++++

   Uts is a portable artificial neural network simulator written on top of
   the Tool Control Language (Tcl) and the Tk UI toolkit. As a result, the
   user interface is readily modifiable, and the graphical user interface
   and visualization tools can be used simultaneously with scripts written
   in Tcl. Uts itself implements only the connectionist paradigm of linked
   units in Tcl and the basic elements of the graphical user interface. To
   make a ready-to-use package, there exist modules which use Uts to do
   back-propagation (tkbp) and Gaussian mixture optimization via EM
   (tkmxm). Uts is available in ftp.cs.toronto.edu in directory /pub/xerion.

16. Neocognitron simulator
++++++++++++++++++++++++++

   The simulator is written in C and comes with a list of references which
   must be read to understand the specifics of the implementation.
   The unsupervised version is coded without (!) C-cell inhibition.
   Available for anonymous ftp from unix.hensa.ac.uk [129.12.21.7] in 
   /pub/neocognitron.tar.Z (130 kB). 

17. Multi-Module Neural Computing Environment (MUME)
++++++++++++++++++++++++++++++++++++++++++++++++++++

   MUME is a simulation environment for multi-module neural computing. It
   provides an object-oriented facility for the simulation and training of
   multiple nets with various architectures and learning algorithms. MUME
   includes a library of network architectures including feedforward, simple
   recurrent, and continuously running recurrent neural networks. Each
   architecture is supported by a variety of learning algorithms. MUME can
   be used for large scale neural network simulations as it provides support
   for learning in multi-net environments. It also provides pre- and
   post-processing facilities. The modules are provided in a library.
   Several "front-ends" or clients are also available. X-Window support by
   editor/visualization tool Xmume. MUME can be used to include non-neural
   computing modules (decision trees, ...) in applications. MUME is
   available for educational institutions by anonymous ftp on
   mickey.sedal.su.oz.au [129.78.24.170] after signing and sending a
   licence: /pub/license.ps (67 kb). Contact: Marwan Jabri, SEDAL, Sydney
   University Electrical Engineering, NSW 2006 Australia,
   marwan@sedal.su.oz.au 

18. LVQ_PAK, SOM_PAK
++++++++++++++++++++

   These are packages for Learning Vector Quantization and Self-Organizing
   Maps, respectively. They have been built by the LVQ/SOM Programming Team
   of the Helsinki University of Technology, Laboratory of Computer and
   Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND. There
   are versions for Unix and MS-DOS available from cochlea.hut.fi
   [130.233.168.48] as /pub/lvq_pak/lvq_pak-2.1.tar.Z (340 kB, Unix sources),
   /pub/lvq_pak/lvq_p2r1.exe (310 kB, MS-DOS self-extract archive), 
   /pub/som_pak/som_pak-1.2.tar.Z (251 kB, Unix sources), 
   /pub/som_pak/som_p1r2.exe (215 kB, MS-DOS self-extract archive). (further
   programs to be used with SOM_PAK and LVQ_PAK can be found in /pub/utils).
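
   The LVQ1 rule implemented by LVQ_PAK has a simple textbook form,
   sketched here in C for illustration (function and variable names are
   ours, not the package's): the codebook vector nearest the input sample
   is moved toward the sample when the class labels match, and away from
   it otherwise.

```c
#include <stddef.h>

/* One LVQ1 training step applied to the winning codebook vector w.
   x is the input sample, dim the vector dimension, class_match is
   nonzero when w's label equals x's label, alpha the learning rate. */
void lvq1_update(double *w, const double *x, size_t dim,
                 int class_match, double alpha)
{
    double sign = class_match ? 1.0 : -1.0;
    for (size_t i = 0; i < dim; i++)
        w[i] += sign * alpha * (x[i] - w[i]);
}
```

   The SOM rule differs mainly in that all units within a shrinking
   neighborhood of the winner are updated, always toward the sample.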

19. Nevada Backpropagation (NevProp)
++++++++++++++++++++++++++++++++++++

   NevProp is a free, easy-to-use feedforward backpropagation (multilayer
   perceptron) program. It uses an interactive character-based interface,
   and is distributed as C source code that should compile and run on most
   platforms. (Precompiled executables are available for Macintosh and DOS.)
   The original version was Quickprop 1.0 by Scott Fahlman, as translated
   from Common Lisp by Terry Regier. We added early-stopped training based
   on a held-out subset of data, c index (ROC curve area) calculation, the
   ability to force gradient descent (per-epoch or per-pattern), and
   additional options. FEATURES (NevProp version 1.16): UNLIMITED (except by
   machine memory) number of input PATTERNS; UNLIMITED number of input,
   hidden, and output UNITS; Arbitrary CONNECTIONS among the various layers'
   units; Clock-time or user-specified RANDOM SEED for initial random
   weights; Choice of regular GRADIENT DESCENT or QUICKPROP; Choice of
   PER-EPOCH or PER-PATTERN (stochastic) weight updating; GENERALIZATION to
   a test dataset; AUTOMATICALLY STOPPED TRAINING based on generalization;
   RETENTION of best-generalizing weights and predictions; Simple but useful
   GRAPHIC display to show smoothness of generalization; SAVING of results
   to a file while working interactively; SAVING of weights file and
   reloading for continued training; PREDICTION-only on datasets by applying
   an existing weights file; In addition to RMS error, the concordance, or c
   index is displayed. The c index (area under the ROC curve) shows the
   correctness of the RELATIVE ordering of predictions AMONG the cases; ie,
   it is a measure of discriminative power of the model. AVAILABILITY: The
   most updated version of NevProp will be made available by anonymous ftp
   from the University of Nevada, Reno: On ftp.scs.unr.edu [134.197.10.130]
   in the directory "pub/goodman/nevpropdir", e.g. README.FIRST (45 kb) or 
   nevprop1.16.shar (138 kb). Version 2 (not yet released) is intended to
   have some new features: more flexible file formatting (including access
   to external data files; option to prerandomize data order; randomized
   stochastic gradient descent; option to rescale predictor (input)
   variables); linear output units as an alternative to sigmoidal units for
   use with continuous-valued dependent variables (output targets);
   cross-entropy (maximum likelihood) criterion function as an alternative
   to square error for use with categorical dependent variables
   (classification/symbolic/nominal targets); and interactive interrupt to
   change settings on-the-fly. Limited support is available from Phil
   Goodman (goodman@unr.edu), University of Nevada Center for Biomedical
   Research. 
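
   The c index mentioned above has a direct pairwise interpretation,
   sketched here in C as our own illustration (not NevProp's code): over
   all pairs of one positive and one negative case, it is the fraction of
   pairs in which the positive case received the higher prediction, with
   ties counted as half.

```c
#include <stddef.h>

/* c index (area under the ROC curve), computed the straightforward
   O(npos*nneg) way.  pos holds the model's predictions for the
   positive cases, neg those for the negative cases. */
double c_index(const double *pos, size_t npos,
               const double *neg, size_t nneg)
{
    double concordant = 0.0;
    for (size_t i = 0; i < npos; i++)
        for (size_t j = 0; j < nneg; j++) {
            if (pos[i] > neg[j])       concordant += 1.0;  /* correct order */
            else if (pos[i] == neg[j]) concordant += 0.5;  /* tie */
        }
    return concordant / ((double)npos * (double)nneg);
}
```

   A value of 1.0 means every positive case outranks every negative case;
   0.5 means the ordering is no better than chance.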

20. Fuzzy ARTmap
++++++++++++++++

   This is just a small example program. Available for anonymous ftp from
   park.bu.edu [128.176.121.56] as ftp://cns-ftp.bu.edu/pub/fuzzy-artmap.tar.Z
   (44 kB). 

21. PYGMALION
+++++++++++++

   This is a prototype that stems from an ESPRIT project. It implements
   back-propagation, self-organizing maps, and Hopfield nets. Available for
   ftp from ftp.funet.fi [128.214.248.6] as 
   /pub/sci/neural/sims/pygmalion.tar.Z (1534 kb). (Original site is
   imag.imag.fr: archive/pygmalion/pygmalion.tar.Z). 

22. Basis-of-AI-NN Software
+++++++++++++++++++++++++++

   DOS and UNIX C source code, examples and DOS binaries are available in
   the following different program sets: 

      [backprop, quickprop, delta-bar-delta, recurrent networks],
      [simple clustering, k-nearest neighbor, LVQ1, DSM],
      [Hopfield, Boltzman, interactive activation network],
      [interactive activation network],
      [feedforward counterpropagation],
      [ART I],
      [a simple BAM] and
      [the linear pattern classifier]
      

   For details see: Basis of AI NN software at
   http://www.mcs.com/~drt/svbp.html . 

   An improved professional version of backprop is also available, $30 for
   regular people, $200 for businesses and governmental agencies. See: Basis
   of AI Professional Backprop at http://www.mcs.com/~drt/probp.html . 

   Questions to: Don Tveter, drt@mcs.com 

23. Matrix Backpropagation
++++++++++++++++++++++++++

   MBP (Matrix Back Propagation) is a very efficient implementation of the
   back-propagation algorithm for current-generation workstations. The
   algorithm includes a per-epoch adaptive technique for gradient descent.
   All the computations are done through matrix multiplications and make use
   of highly optimized C code. The goal is to reach nearly peak performance
   on RISCs with superscalar capabilities and fast caches. On some machines
   (and with large networks) a 30-40x speed-up can be measured with respect
   to conventional implementations. The software is available by anonymous
   ftp from risc6000.dibe.unige.it [130.251.89.154] as /pub/MBPv1.1.tar.Z
   (Unix version), /pub/MBPv11.zip.Z (MS-DOS version), /pub/mpbv11.ps
   (Documentation). For more information, contact Davide Anguita
   (anguita@dibe.unige.it). 

24. WinNN
+++++++++

   WinNN is a shareware Neural Networks (NN) package for Windows 3.1. WinNN
   incorporates a very user-friendly interface with a powerful computational
   engine. WinNN is intended to be used as a tool for beginners and more
   advanced neural network users; it provides an alternative to more
   expensive and harder-to-use packages. WinNN can implement feed-forward
   multi-layered NN and uses a modified fast back-propagation for training.
   Extensive on-line help. Has various neuron functions. Allows on-the-fly
   testing of the network performance and generalization. All training
   parameters can be easily modified while WinNN is training. Results can be
   saved on disk or copied to the clipboard. Supports plotting of the
   outputs and weight distribution. Available for ftp from
   ftp.cc.monash.edu.au as /pub/win3/programr/winnn97.zip (747 kB). 

25. BIOSIM
++++++++++

   BIOSIM is a biologically oriented neural network simulator. Public
   domain, runs on Unix (a less powerful PC version is available, too), easy
   to install, bilingual (German and English), has a GUI (Graphical User
   Interface), designed for research and teaching, provides online help
   facilities, offers controlling interfaces, batch version is available, a
   DEMO is provided. REQUIREMENTS (Unix version): X11 Rel. 3 and above,
   Motif Rel 1.0 and above, 12 MB of physical memory, recommended are 24 MB
   and more, 20 MB disc space. REQUIREMENTS (PC version): PC-compatible with
   MS Windows 3.0 and above, 4 MB of physical memory, recommended are 8 MB
   and more, 1 MB disc space. Four neuron models are implemented in BIOSIM:
   a simple model only switching ion channels on and off, the original
   Hodgkin-Huxley model, the SWIM model (a modified HH model) and the
   Golowasch-Buchholz model. Dendrites consist of a chain of segments
   without bifurcation. A neural network can be created by using the
   interactive network editor which is part of BIOSIM. Parameters can be
   changed via context sensitive menus and the results of the simulation can
   be visualized in observation windows for neurons and synapses. Stochastic
   processes such as noise can be included. In addition, biologically
   oriented learning and forgetting processes are modeled, e.g.
   sensitization, habituation, conditioning, Hebbian learning and
   competitive learning. Three synaptic types are predefined (an
   excitatory synapse type, an inhibitory synapse type and an electrical
   synapse). Additional synaptic types can be created interactively as
   desired. Available for ftp from ftp.uni-kl.de in directory
   /pub/bio/neurobio: Get /pub/bio/neurobio/biosim.readme (2 kb) and 
   /pub/bio/neurobio/biosim.tar.Z (2.6 MB) for the Unix version or 
   /pub/bio/neurobio/biosimpc.readme (2 kb) and 
   /pub/bio/neurobio/biosimpc.zip (150 kb) for the PC version. Contact:
   Stefan Bergdoll; Department of Software Engineering (ZXA/US); BASF Inc.;
   D-67056 Ludwigshafen; Germany; bergdoll@zxa.basf-ag.de; phone
   0621-60-21372; fax 0621-60-43735 

26. The Brain
+++++++++++++

   The Brain is an advanced neural network simulator for PCs that is simple
   enough to be used by non-technical people, yet sophisticated enough for
   serious research work. It is based upon the backpropagation learning
   algorithm. Three sample networks are included. The documentation included
   provides you with an introduction and overview of the concepts and
   applications of neural networks as well as outlining the features and
   capabilities of The Brain. The Brain requires 512K memory and MS-DOS or
   PC-DOS version 3.20 or later (versions for other OS's and machines are
   available). A 386 (with maths coprocessor) or higher is recommended for
   serious use of The Brain. Shareware payment required. The demo version is
   restricted in the number of units the network can handle due to memory
   constraints on PCs. Registered version allows use of extra memory.
   External documentation included: 39Kb, 20 Pages. Source included: No
   (Source comes with registration). Available via anonymous ftp from
   ftp.tu-clausthal.de as /pub/msdos/science/brain12.zip (78 kb) and from
   ftp.technion.ac.il as /pub/contrib/dos/brain12.zip (78 kb) Contact: David
   Perkovic; DP Computing; PO Box 712; Noarlunga Center SA 5168; Australia;
   Email: dip@mod.dsto.gov.au (preferred) or dpc@mep.com or
   perkovic@cleese.apana.org.au 

27. FuNeGen 1.0
+++++++++++++++

   FuNeGen is a MLP based software program to generate fuzzy rule based
   classifiers. A limited version (maximum of 7 inputs and 3 membership
   functions for each input) for PCs is available for anonymous ftp from
   obelix.microelectronic.e-technik.th-darmstadt.de in directory 
   /pub/neurofuzzy. For further information see the file read.me. Contact:
   Saman K. Halgamuge 

28. NeuDL -- Neural-Network Description Language
++++++++++++++++++++++++++++++++++++++++++++++++

   NeuDL is a description language for the design, training, and operation
   of neural networks. It is currently limited to the backpropagation
   neural-network model; however, it offers a great deal of flexibility. For
   example, the user can explicitly specify the connections between nodes
   and can create or destroy connections dynamically as training progresses.
   NeuDL is an interpreted language resembling C or C++. It also has
   instructions dealing with training/testing set manipulation as well as
   neural network operation. A NeuDL program can be run in interpreted mode
   or it can be automatically translated into C++ which can be compiled and
   then executed. The NeuDL interpreter is written in C++ and can be easily
   extended with new instructions. NeuDL is available from the anonymous ftp
   site at The University of Alabama: cs.ua.edu (130.160.44.1) in the file 
   /pub/neudl/NeuDLver021.tar. The tarred file contains the interpreter
   source code (in C++) a user manual, a paper about NeuDL, and about 25
   sample NeuDL programs. A document demonstrating NeuDL's capabilities is
   also available from the ftp site as /pub/neudl/NeuDL/demo.doc or 
   /pub/neudl/demo.doc. For more information contact the author: Joey Rogers
   (jrogers@buster.eng.ua.edu). 

29. NeoC Explorer (Pattern Maker included)
++++++++++++++++++++++++++++++++++++++++++

   The NeoC software is an implementation of Fukushima's Neocognitron neural
   network. Its purpose is to test the model and to facilitate interactivity
   for the experiments. Some substantial features: GUI, explorer and tester
   operation modes, recognition statistics, performance analysis, elements
   displaying, easy net construction. PLUS, a pattern maker utility for
   testing ANN: GUI, text file output, transformations. Available for
   anonymous FTP from OAK.Oakland.Edu (141.210.10.117) as 
   /SimTel/msdos/neurlnet/neocog10.zip (193 kB, DOS version) 

30. AINET
+++++++++

   aiNet is a shareware Neural Networks (NN) application for MS-Windows 3.1.
   It does not require learning, has no limits on parameters (input & output
   neurons) and no limits on sample size. It is not sensitive to noise in
   the data. The database can be changed dynamically. It provides a way to
   estimate the rate of error in your prediction. Missing values are handled
   automatically. It has a graphical spreadsheet-like user interface and an
   on-line help system. It also provides several different chart types.
   aiNet manual (90 pages) is divided into: "User's Guide", "Basics About
   Modeling with the AINET", "Examples". Special requirements: Windows 3.1,
   VGA or better. Can be downloaded from 
   ftp://ftp.cica.indiana.edu/pub/pc/win3/programr/ainet100.zip or from 
   ftp://oak.oakland.edu/SimTel/win3/math/ainet100.zip 

31. DemoGNG
+++++++++++

   This simulator is written in Java and should therefore run without
   compilation on all platforms where a Java interpreter (or a browser with
   Java support) is available. It implements the following algorithms and
   neural network models: 
    o Hard Competitive Learning (standard algorithm) 
    o Neural Gas (Martinetz and Schulten 1991) 
    o Competitive Hebbian Learning (Martinetz and Schulten 1991, Martinetz
      1993) 
    o Neural Gas with Competitive Hebbian Learning (Martinetz and Schulten
      1991) 
    o Growing Neural Gas (Fritzke 1995) 
   DemoGNG is distributed under the GNU General Public License. It allows
   you to experiment with the different methods using various probability
   distributions. All model parameters can be set interactively on the
   graphical user interface. A teach mode is provided to observe the models
   in "slow-motion" if so desired. It is currently not possible to
   experiment with user-provided data, so the simulator is useful mainly
   for demonstration and teaching purposes and as a sample implementation of
   the above algorithms. 
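
   The simplest of the listed models, hard competitive learning, amounts
   to a two-step loop per input signal, sketched here in C rather than
   Java for brevity (an illustration only, not DemoGNG's code): find the
   unit nearest the signal, then move just that winner toward it.

```c
#include <stddef.h>

/* One step of hard competitive learning.  units holds nunits reference
   vectors of dimension dim, stored row-major; x is the input signal and
   eps the adaptation rate.  Returns the index of the winning unit. */
size_t hcl_step(double *units, size_t nunits, size_t dim,
                const double *x, double eps)
{
    size_t best = 0;
    double best_d = -1.0;
    /* 1. find the unit with the smallest squared distance to x */
    for (size_t u = 0; u < nunits; u++) {
        double d = 0.0;
        for (size_t i = 0; i < dim; i++) {
            double diff = units[u * dim + i] - x[i];
            d += diff * diff;
        }
        if (best_d < 0.0 || d < best_d) { best_d = d; best = u; }
    }
    /* 2. move only the winner toward the signal */
    for (size_t i = 0; i < dim; i++)
        units[best * dim + i] += eps * (x[i] - units[best * dim + i]);
    return best;
}
```

   The other models refine this scheme: Neural Gas also adapts non-winners
   with a rank-dependent rate, and Growing Neural Gas inserts new units
   where the accumulated error is largest.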

   DemoGNG can be accessed most easily at 
   http://www.neuroinformatik.ruhr-uni-bochum.de/ in the file 
   /ini/VDM/research/gsn/DemoGNG/GNG.html where it is embedded as a Java
   applet in a Web page and is downloaded for immediate execution when you
   visit this page. An accompanying paper entitled "Some competitive
   learning methods" describes the implemented models in detail and is
   available in html at the same server in the directory 
   ini/VDM/research/gsn/JavaPaper/. 

   It is also possible to download the complete source code and a Postscript
   version of the paper via anonymous ftp from
   ftp.neuroinformatik.ruhr-uni-bochum.de [134.147.176.16] in directory
   /pub/software/NN/DemoGNG/. The software is in the file 
   DemoGNG-1.00.tar.gz (193 KB) and the paper in the file sclm.ps.gz (89
   KB). There is also a README file (9 KB). Please send any comments and
   questions to demogng@neuroinformatik.ruhr-uni-bochum.de which will reach
   Hartmut Loos who has written DemoGNG as well as Bernd Fritzke, the author
   of the accompanying paper. 

For some of these simulators there are user mailing lists. Get the packages
and look into their documentation for further info.

If you are using a small computer (PC, Mac, etc.) you may want to have a
look at the Central Neural System Electronic Bulletin Board (see question 
"Other sources of information"). Modem: 409-737-5222; Sysop: Wesley R.
Elsberry; 4160 Pirates' Beach, Galveston, TX, USA; welsberr@orca.tamu.edu.
There are lots of small simulator packages, the CNS ANNSIM file set. There
is an ftp mirror site for the CNS ANNSIM file set at me.uta.edu
[129.107.2.20] in the /pub/neural directory. Most ANN offerings are in 
/pub/neural/annsim. 

------------------------------------------------------------------------

Next part is part 6 (of 7). Previous part is part 4.

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
