
           INSTRUCTIONS FOR USING THE NEURAL NETWORK SIMULATOR
                              Eric Melz 
                            June 26, 1993

Overview
--------
The simulate program is an easy-to-use and flexible neural network
simulator.  It is designed to simulate associative-learning networks,
such as those trained with the LMS (a.k.a. Rescorla-Wagner) learning
procedure or with backpropagation.  Multiple interacting networks
can be simulated in parallel, with their behavior and interactions
specified by a simple command language.

The program is designed to be as flexible as possible.  The network
specification, training sets, and network weights are contained in
separate files so that different versions of a simulation can be
easily created and maintained.  The structure and behavior of networks
are specified using simple high-level constructs, which makes
modification of a simulation easy.  For example, a single-layer
linear network can be changed to a multiple-layer backpropagation
network in a matter of keystrokes.

Some of the more useful features of the simulator include:

* Simulation weights can be saved, and later used to initialize the
  weights of a new simulation.

* Multilayer networks with arbitrary interconnectivity can be created.

* Quickprop, a faster variant of backpropagation, is supported.

* Network behavior on each cycle in the simulation can be specified. 

* The output of the simulation on each cycle can be specified.


Compiling the Program
---------------------

The files simulate.c and simulate.h are the only files needed for
compilation.  The simulate program uses math functions, so the C math
library must be linked in.  Follow the standard compilation procedure
for your machine.  For example, on UNIX, type

% cc simulate.c -o simulate -lm


Invoking the Program
--------------------

The syntax for invoking the program is:

% simulate <NetFile> <TsFile> <InWtFile> <OutWtFile>

<NetFile> is the name of the file which contains the network and
simulation specification.
	
<TsFile> is the name of the file that contains the training set.

<InWtFile> is the name of the file which contains the initial weights.

<OutWtFile> is the name of the file in which the final simulation
weights will be saved.

<InWtFile> and <OutWtFile> are optional, and may be left unspecified.
If weights are to be saved but no initial weight file is needed,
substitute a dash (-) for <InWtFile>.
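
For example, to train with random initial weights but still save the
final weights, the invocation might look like this (the file names here
are hypothetical; substitute your own):

```
% simulate encoder.net encoder.ts - encoder.wts
```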

The program will ask for a random number seed.  The seed can be typed
interactively, or it can be supplied on the command line using input
redirection.  For example, to specify a seed of 34, type

% echo 34 | simulate <NetFile> <TsFile> <InWtFile> <OutWtFile> 


The Network File
----------------

The network file consists of three sections: (1) global simulation
parameters, (2) network definitions, and (3) simulation commands.
The sections begin with the labels GLOBAL, NETWORK, and COMMANDS,
respectively (see the example network files).

The syntax for a parameter or command is the name of the
parameter/command, followed by a colon and then a list of the relevant
arguments.  A comment may be placed after a parameter/command, or on a
line by itself, by preceding the comment with a hash mark (#).  For
example,

WEIGHT-MIN    :-.5       # The minimum value of an initial weight

UNITS         : 10 5 10  # 10 units in the input and output layers,
                         # 5 in the hidden layer

are valid parameter definitions.
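
Putting these together, a network file has the following overall shape.
This is only a sketch of the section layout: of the entries below, only
WEIGHT-MIN and UNITS are taken from this document, and the parameters
and commands actually available in each section are shown in the
example network files.

```
GLOBAL                       # global simulation parameters
WEIGHT-MIN    : -.5          # minimum value of an initial weight

NETWORK                      # network definitions
UNITS         : 10 5 10      # layer sizes

COMMANDS                     # simulation commands
# (see the example network files for the available commands)
```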


The Training Set File
---------------------

The training set file begins with a specification of the number of
input units, the number of output units, and also the number of units
on any intermediate layers that are to be treated as input layers.
The syntax for these declarations is:
IN : <InUnits> 
OUT: <OutUnits>
Ln : <IntermediateUnits>

where n is the number of the layer that is to be treated as an input
layer.  (See the file hinton.ts for an example in which intermediate
layers are treated as input layers.)

After the unit declarations, a list of the training patterns should be
specified in the format

<InputPat1> <IntermediatePat1> <OutputPat1>
<InputPat2> <IntermediatePat2> <OutputPat2>
               . . .
<InputPatN> <IntermediatePatN> <OutputPatN>

where <InputPatN> is a vector of the values to be clamped on the
input layer, <IntermediatePatN> is a vector of the values to be
clamped on the intermediate layers (if any have been specified), and
<OutputPatN> is the training-pattern vector to be presented on the
output layer.

As in the network file, comments may be placed in the training set
file by preceding them with a hash mark (#).
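
For instance, a training set file for a two-input, one-output network
(an AND-style mapping, chosen purely for illustration) might look like:

```
IN : 2            # 2 input units
OUT: 1            # 1 output unit

# <InputPat> <OutputPat>
0 0   0
0 1   0
1 0   0
1 1   1
```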


The Examples
------------
Several example network files have been included with the simulator.
The best way to get started is to adapt the example that is most
similar to your application.  The example directories are:

ENCODER - contains simulation files for a 10-5-10 encoder (described
in Rumelhart & McClelland, 1986, vol. 1).  Demonstrates a simple
backpropagation network.

HINTON - contains simulation files for Hinton's "family tree problem"
(Hinton, 1986).  This is a more sophisticated example of
backpropagation: the network has six layers instead of three.
hinton-bp.net is a backpropagation version of the network, and
hinton-qp.net is a quickpropagation version.

SHANKS - contains simulation files for some simple linear associative
networks (discussed in Shanks, 1991).  A couple of UNIX scripts are
included in this directory to illustrate how multiple simulations can
be run from a single script.
