                        NEURAL NETWORK PC TOOLS
                           COMPANION SOFTWARE
                              USER'S GUIDE

       Copyright (c) 1990, Russell C. Eberhart and Roy W. Dobbins

     $Revision:   1.5  $	    $Date:   22 Oct 1990 10:13:38  $


				  INTRODUCTION

     The software in this package is described in detail in our book,
     entitled Neural Network PC Tools: A Practical Guide, published by
     Academic Press in 1990.  This software may be copied and
     distributed in accordance with the principles of shareware, AS LONG
     AS IT IS NOT MODIFIED.  In particular, any problems with the source
     code should be brought to the attention of the authors.

     If you use this software, consider it as shareware and please send
     $20.00 to the authors at the following address:  Roy Dobbins, 5833
     Humblebee Road, Columbia, MD 21045.  If you live outside the United
     States or Canada, please send $26.00 US to help defray the
     additional cost of air mail.

     In addition to this readme file, the Neural Network PC Tools
     (NNPCT) Companion Software consists of files in six subdirectories:
     MISC, KOHONEN, BATCHNET, MUSIC, INCLUDE, and LIB.  As originally
     distributed, the first four of these subdirectories contain one
     zipped file each.  For example, the BATCHNET subdirectory contains
     the file backprop.exe.  So, before you can use the software you
     must unzip it.  The files unzip themselves: all you have to do is
     type the name of the *.exe file.  For example, to unzip all of the
     files in the BATCHNET subdirectory, just type "backprop".

     The INCLUDE and LIB subdirectories have files all ready to go; you
     don't have to unzip them.  Be aware that, since this software is
     distributed as shareware, you may receive the files already
     unzipped in some or all of the subdirectories.

     Additional neural network software is also available as shareware.
     The NNPCT Companion Software is one module in a suite of neural net
     software called The Neural Network Toolkit.  The Neural Network
     Toolkit includes, in addition to the NNPCT Companion Software,
     versions of backpropagation and Kohonen networks with graphics
     interfaces.  You can watch the values of the network weights and
     activation values change as the network trains, either in graphical
     or alphanumeric format.  The Neural Network Toolkit currently costs
     $40.00 ($46.00 outside the U.S. and Canada).  If you are already a
     registered owner of the NNPCT Companion Software, you can get the
     rest of the Neural Network Toolkit for an additional $20.00 ($26.00
     outside the U.S. and Canada).  As we add network paradigms to the
     Toolkit, it is expected that the price will increase somewhat.



				   BACKGROUND

     Much of the excitement about artificial neural networks stems from
     their apparent ability to imitate the brain's knack for making
     decisions and drawing conclusions when presented with complex,
     noisy, and/or partial information.  This software is for the
     engineer or programmer who is interested in solving practical
     problems with neural networks.

     It is a myth that the only way to achieve results with neural networks
     is with a million dollars, a supercomputer, and an interdisciplinary
     team of Nobel laureates.  There are some commercial vendors out there
     who would like you to believe that, though.

     Using simple hardware and software tools, it is possible to solve
     practical problems that are otherwise impossible or impractical.
     Neural network tools (NNT's) offer a solution to some problems that
     can't be solved any other way known to the authors.


		       THE BACK-PROPAGATION NNT: BATCHNET

     In the BATCHNET subdirectory you will find both source and
     executable code for a "standard" three-layer back-propagation
     neural network.  (Remember to unzip the files if necessary.)  The
     executable program is called batchnet.exe; its source code is in
     the file batchnet.c; it can be compiled using Turbo C 2.0,
     Microsoft C 5.1, or compatible C compilers from other vendors.

     Batchnet.exe was compiled using the 80x87 emulator mode, so that it
     runs even if you don't have a coprocessor.  If you have one and
     want batchnet to run faster, which may be especially important in
     training, you can recompile batchnet.c using the 80x87 option.
     Always use the compact memory model.

     To run the batchnet program, you must specify the name of the run
     file that it must use.  Demo.run is the run file for the demo.bat
     demonstration.  Look at the demo.bat and demo.run files to see what
     we mean.

     Demo.bat also illustrates one of the options for batchnet.  You can
     specify the interval of iterations between average sum-squared error
     printouts with the -e option:  -e10 prints the error every 10
     iterations.  The default interval between error printouts is 100
     iterations.

     The other option for batchnet specifies the average sum-squared
     error (per output node, per pattern) required for the program to
     terminate training.  The default value is 0.02; a command option of
     -d.01 overrides this with an error termination value of .01.

     In the run file, you specify a number of things.  Look at demo.run in
     detail to see what they are; an explanation following the run data
     for the two runs tells you what goes where.

     First, you specify the number of runs.  The demo has two.	This is
     fairly typical.  You often have a training run followed by a test run,
     as is the case in the demo.  You can, however, set up the software to
     do as many runs as you want: hence the name "batchnet".

     You then specify the filenames for a number of files:

	- the output file, which gives the values of the output nodes
	  for each pattern on the last iteration (or the only
	  iteration, if you are in testing mode);
	- the error file, which gives you the average sum-squared error
	  value every specified number of iterations;
	- the source pattern file (values normalized between 0 and 1);
	- the input weights file, which can be automatically generated
	  for a training run, and which for a testing run is the output
	  weights file from training; and
	- the output weights file, which gives you weight values after
	  the last iteration.

     Note that the pattern files have values for each input node followed
     by values for each output node followed by an ID field that you can
     use to identify each pattern in some way.	The input and output node
     values should be between 0 and 1.
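     For the 9-input, 2-output demo network, a single pattern might look
     like the line below.  The values and the ID field are made up for
     illustration; see train.pat and test.pat for real examples and for
     the exact whitespace layout.

```
0.43 0.12 0.87 0.55 0.09 0.66 0.31 0.78 0.24  1 0  spike017
```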

     Following filenames, you specify, for each run, the number of input
     patterns, the number of epochs (iterations of entire pattern set), the
     number of input nodes, number of hidden nodes, number of output nodes,
     the value for the learning coefficient (eta), and the value for the
     momentum factor (alpha).  The number of epochs varies a lot during
     training, but is often in the range 100-1000; for testing, you
     only do one iteration.
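     Putting the pieces together, a run file for a single training run
     might be sketched as follows.  The filenames and numbers here are
     hypothetical, and the field order simply follows the description
     above; treat the commented layout inside demo.run itself as the
     authoritative reference.

```
1                            number of runs
train.out                    output file
train.err                    error file
train.pat                    pattern file
initial.wts                  input weights file
train.wts                    output weights file
200 300 9 4 2 0.15 0.075     patterns epochs inputs hidden outputs eta alpha
```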

     Sample files are given that you can run with demo.bat; the output
     files you will get when you run the demo are already on the diskette
     as mytest.out, mytrain.out, mytrain.wts, mytest.wts, mytrain.err, and
     mytest.err.  You will get similar files without the "my" prefix when
     you run the demo.bat program, and you can compare corresponding files
     to see that they are the same.

     All you have to do is run "demo.bat" in order to both train and test
     the batchnet artificial neural network on the patterns in the
     train.pat and test.pat files.  These pattern files are built from
     actual electroencephalogram (EEG) spike parameter data, and illustrate
     the use of a parameter-based NNT.

     The training phase of demo.bat takes about 45 minutes on a 4.77 MHz
     8088 PC with coprocessor; 18 minutes on a 12 MHz Compaq with
     coprocessor;  140 minutes on a 10 MHz Grid 80286 Laptop with no
     coprocessor.  The coprocessor makes the difference!


				HINTON DIAGRAMS

     OVERVIEW

     The program hinton.exe in the MISC subdirectory displays Hinton
     diagrams - graphical representations of neural network weights.
     (Remember to unzip the files if necessary.)  The program assumes
     that the weights for a three-layer network have been stored in a
     disk file as ASCII floating point numbers.  An example of a valid
     weights file that you have on this shareware diskette is
     mytrain.wts.  An example of how to invoke the program is given in
     hintdemo.bat, which calls in the weights in mytrain.wts, with the
     correct network structure (9 input, 4 hidden, 2 output neurodes).

     SYSTEM REQUIREMENTS

     You need a PC with EGA or VGA to get color output with this
     program. We have tried it on a CGA too; it ran OK, but the
     resolution of the output left a lot to be desired.  Ensure that the
     necessary graphics driver files are all present in the directory
     from which HINTON.EXE is run:

       HINTON.EXE
       EGAVGA.BGI
       HERC.BGI
       CGA.BGI
       HINTDEMO.BAT (needed only if you are running the demo)

     USE

     To use the program, at the DOS prompt type:

	hinton {-c} datafile input hidden output
	-c	      no color
	datafile      name of data file
	input	      number of units in input layer
	hidden	      number of units in hidden layer
	output	      number of units in output layer

     Use the -c option if you have a monochrome screen or if you want to
     make hardcopies of a CGA screen.

     Currently HINTON.EXE only works with three-layer (one hidden layer)
     feedforward networks.

     DATA FILE ORGANIZATION

     The file must be in the form of ASCII text floating point numbers, in
     the order given below:

     data_file is :-
	input_layer_to_hidden_layer_weights
	hidden_layer_to_output_layer_weights

	input_layer_to_hidden_layer_weights is :-
	      weights_for_hidden_unit_0
	      weights_for_hidden_unit_1
	      weights_for_hidden_unit_2
	      ...
	      weights_for_hidden_unit_h-1

	hidden_layer_to_output_layer_weights is :-
	      weights_for_output_unit_0
	      weights_for_output_unit_1
	      weights_for_output_unit_2
	      ...
	      weights_for_output_unit_o-1

	      weights_for_hidden_unit_n is :-
		    weight from input unit 0 to hidden unit n
		    weight from input unit 1 to hidden unit n
		    weight from input unit 2 to hidden unit n
		    ...
		    weight from input unit i-1 to hidden unit n
		    weight from bias unit to hidden unit n

	      weights_for_output_unit_n is :-
		    weight from hidden unit 0 to output unit n
		    weight from hidden unit 1 to output unit n
		    weight from hidden unit 2 to output unit n
		    ...
		    weight from hidden unit h-1 to output unit n
		    weight from bias unit to output unit n

     Note that you must have the weights from the bias units present in the
     file; they are displayed as the leftmost column on the screen.

     MENU

     When you run the program, you see the weights to the hidden layer from
     the input (Hidden) at the top of the screen (in blue and white if you
     have a color monitor).  Toward the bottom of the screen (in red and
     white if you have a color monitor) are the hidden to output (Output)
     weights.  The main menu consists of the following commands, displayed
     in a bar at the bottom of the screen:

	Hidden	Out  View  Clear  Zoom	Shrink	Flip  Unit  Range  Quit

     Under the main menu bar is the version of the software, and an
     indication of the window (Hidden or Output) for which commands are
     activated, as well as the units and range being displayed.

     Brief description of main menu commands:

     Hidden
     Activate the hidden layer window commands.  Does not alter the
     display, but all future commands are directed to this window.

     Out
     Activate the output layer window commands.

     View
     Display (or re-display) the data in the current window.  Values are
     displayed as small filled rectangles.  The area of a rectangle is
     proportional to the magnitude, while the color and fill pattern
     indicate the sign.  Currently, positive numbers are displayed in white
     while negative numbers are displayed in color, the color varying from
     layer to layer - blue for hidden, red for output.


     Clear
     Clear the current window (does not alter the data in any way; merely
     erases the display window).

     Zoom
     Increase the magnification of the current window.	The opposite of
     Shrink.  Data is scaled up and appears larger in the window.  It is
     possible that some of the image will be clipped, if it now falls
     outside the window boundaries.

     Shrink
     Decrease the magnification of the current window.	The opposite of
     Zoom.  Data is scaled down and appears smaller in the window.  The
     minimum shrinkage is down to the level of one pixel, after which
     further Shrink commands are ignored.

     Flip
     Turn the image in the window through 90 degrees.  Horizontally
     organized data is displayed vertically and vice versa.  This command
     acts as a toggle.  A subsequent Flip command rotates the image back
     to its original orientation.

     Unit
     Specify the unit(s) of the current layer that are to be displayed in
     the window.  The default is all units.  You can enter a single unit or
     you can enter a range as a pair of numbers.  For example, to display
     units 10 through 15, enter:  10 15

     Range
     Specify the units of the input to the current layer that are to be
     displayed in the window.  The default is all input units.	You can
     enter a single unit or you can enter a range as a pair of numbers.

     Quit
     Quit the program.	The mode of the screen is changed from graphics
     back to the normal text mode of the PC.

     EXAMPLE OF RUNNING THE HINTON PROGRAM

     To look at the weight values in the file "mytrain.wts", which has
     weights for a 9-4-2 node backprop network, just run the hinton.exe
     program with the following command: hinton mytrain.wts 9 4 2
     (You can do the same thing by running hintdemo.bat.)

     You then see a screen with the main menu prompts below.  By
     default, the Hidden window is active and commands are directed to it.

     To activate the Output layer window commands, hit O for output layer.
     This puts you in the output window.  Then hit Z for zoom, F for flip,
     etc.

     To get a hardcopy of the Hinton diagrams, you must load your favorite
     hot key utility for your CGA, EGA or VGA screen.  Furthermore, to get
     a good representation on a black and white printer, you should run
     hinton -c to suppress on-screen color information, as indicated in the
     command line description.


		      KOHONEN SELF-ORGANIZING NETWORK DEMO

     The program kohodemo.exe in the MISC subdirectory is a simple
     demonstration (warts and all) of a Kohonen-style self-organizing
     network with two-dimensional input.  The source code was written in
     C, and is meant to be a faithful re-creation of the program called
     "Toprem2," for which Dr. Kohonen listed Pascal source code at the
     end of his tutorial presentation material for the 1989 International
     Joint Conference on Neural Networks in Washington, D. C.

     After running the program by typing in "kohodemo", you are given
     the choice of three geometric shapes (square, circle, cross) or
     quitting the program.  When you select a shape, you can watch the
     network weights go through topological ordering, and then watch
     them self-organize to (more or less) conform to the shape you've
     selected.


                    KOHONEN SELF-ORGANIZING NETWORK

     In the KOHONEN subdirectory is a Kohonen-style self-organizing
     network implementation, complete with C source code.  It is
     substantially different from the Kohonen demo described above in
     that it is fully functional self-organizing network software that
     you can use with your own input data.  It allows higher dimensional
     input, and has several displays, including weight vectors and
     activation values.

     A demonstration of the software can be seen by running the demo.bat
     file.  You'll see that the demo.bat file uses the demo.run
     file.  Inside the demo.run file, after the information needed to
     run the program, is a detailed description of what is needed in a
     run file for the Kohonen software.  The demo.bat file also has a
     description of the command line options that can be used with the
     Kohonen network implementation.

     After you start running the program, either with the demo or with
     your own input pattern files, you'll see a menu and some
     information at the top of the screen, and three windows that take
     up the rest of the screen.  In the upper left window are the
     activation values of the output neurodes.  The upper right window
     displays the value for an input pattern.  The lower window displays
     the weights from input to output.  When the program starts running,
     all windows are displaying values in numerical form.

     To see graphical representations in the windows, hit the T key;
     this acts as a toggle between graphics and numerical output.  By
     hitting the + and - keys, you can move among pattern numbers. Note
     that the current epoch and pattern numbers are displayed on the
     third line at the top of the screen.

     If you hit Ctrl-Q, you quit unconditionally.  Hitting the ESC key
     suspends operation; hitting it a second time quits the program and
     saves the weight and output values in files kohonen.wts and
     kohonen.out, respectively.  The weight and output values are saved
     automatically if the program runs to completion.

     To stop temporarily, hit the M key.  This also lets you change
     modes. After hitting M, you are given a choice of Pattern or All.
     If you select Pattern, then Winner, you will see the winning
     neurode for that pattern.  If you select Pattern, then All, you'll
     see the values for all output neurodes for that pattern.

     Selecting All, then Averages, lets you see the averages of all
     outputs for all input patterns.  Selecting All, then Winners, lets
     you see all of the winning neurode values for all of the patterns.
     This last combination lets you see the clustering that is done by
     the network on the input pattern data.

     The demo displays the outputs every epoch, or iteration.  If you
     instead choose to display the outputs only once every 10 iterations,
     with a -e10 on the command line, you can override that interval and
     display the outputs at any time by hitting the E (display) key.  In
     other words, the E key has no effect in the demo, since the outputs
     are displayed every epoch anyway; but when you are running your own
     data, you may choose to display outputs only every 10 or every 100
     epochs in order to increase the speed of operation.  The E key then
     lets you look at the outputs whenever you want between normal
     output displays.

                              MUSIC FILES

     The files in the MUSIC subdirectory are those needed for the music
     composition process described in Chapter 14 of Neural Network PC
     Tools: A Practical Guide.  All of the files, including both
     executable and source code, are in the subdirectory, once you unzip
     them.  Refer to the chapter for their use.

                     INCLUDE AND LIB SUBDIRECTORIES

     The INCLUDE and LIB subdirectories contain files that you'll need
     if you recompile the Kohonen network code.  Otherwise, you can
     probably ignore these files.

