Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!news.duke.edu!concert!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: New SAS macros, jargon, and kangaroos on ftp
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <CyDAyq.9xr@unx.sas.com>
Date: Fri, 28 Oct 1994 05:23:14 GMT
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 144


There are some new SAS macros (TNN and NETIML) for feedforward neural
nets on our ftp server. The jargon list and kangaroos are also there.
The macros are free but won't do you any good unless you have licensed
the SAS products mentioned below. If you want information about
licensing SAS products, call 919 677-8000 and ask for Software Sales.
BTW, I will be out of town next week and not answering email.
.......................................................................
Neural Networks with SAS Software                         Oct 25, 1994

The following files are available by anonymous ftp from ftp.sas.com
(Internet gateway IP 192.35.83.8) in the directory /pub/sugi19/neural :

 README         This document.

 neural1.ps     Sarle, W.S. (1994), "Neural Networks and Statistical
                Models," Proceedings of the Nineteenth Annual SAS Users
                Group International Conference, Cary, NC: SAS Institute,
                pp 1538-1550. (Postscript file)

 neural2.ps     Sarle, W.S. (1994), "Neural Network Implementation in
                SAS Software," Proceedings of the Nineteenth Annual SAS
                Users Group International Conference, Cary, NC: SAS
                Institute, pp 1551-1573. (Slightly revised version,
                postscript file)

 plots.ps       Plots from the 2nd paper in high-resolution graphics.
                (Postscript file)

 macros.sas     Macros from the 2nd paper.

 example.sas    Examples using the macros with the XOR and sine data.
 example.bls    Output from example.sas.

 example2.sas   Examples using the macros with the motorcycle data.
 example2.bls   Output from example2.sas.

 tnn1.sas       The TNN system of macros for feedforward neural nets.
 tnn1.doc       Introductory documentation for TNN.
 tnn1.ref       Reference guide to TNN macros and arguments.
 tnn1ex.sas     Examples using TNN with the XOR, iris, and sine data.
 tnn1ex.bls     Output from tnn1ex.sas.
 tnn1exm.sas    Examples using TNN with the motorcycle data.
 tnn1exm.bls    Output from tnn1exm.sas.

 netiml.sas     The NETIML system of IML modules and macros for
                multilayer perceptrons.
 netiml.ps      Documentation for netiml.sas.
 netimlex.sas   Examples using netiml.sas.
 netimlex.bls   Output from netimlex.sas.

 jargon         Translations of neural network and statistical jargon.

 kangaroos      Nontechnical explanation of training methods and
                nonlinear optimization (plain ASCII version of
                material from neural2.ps, plus related posts from
                the comp.ai.neural-nets newsgroup on Usenet).

Please note that PostScript files (those with a .ps extension) require
a PostScript printer or viewer to be read.

The macros in macros.sas are as follows:

           Product
 Macro     Required   Purpose
 -----     --------   -------
 NETNLP    SAS/OR     Train a multilayer perceptron with one hidden
                      layer by least-squares using PROC NLP
 NETRUN    ---        Run a multilayer perceptron with a DATA step

 NETMODEL  SAS/ETS    Train and run a multilayer perceptron with one
                      hidden layer by least-squares using PROC MODEL,
                      including time-series forecasts

 CPROP1    SAS/STAT   Train a unidirectional counterprop network
 CPROP2    SAS/STAT   Train a bidirectional counterprop network
 CPRUN     SAS/STAT   Run a counterprop network

 RBF       SAS/STAT   Train a radial basis function network
 RBFRUN    ---        Run a radial basis function network
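
NETRUN, for example, needs no additional products because a fitted
network is just arithmetic that an ordinary DATA step can compute. The
sketch below is illustrative only: the weight values are made up, and
the actual macro arguments and variable names are defined in
macros.sas, not here.

   data scored;
      set mydata;                 /* input data set with x1 and x2 */
      /* hypothetical weights; real values come from training     */
      h1 = tanh(-0.3 + 1.2*x1 - 0.7*x2);  /* hidden unit 1        */
      h2 = tanh( 0.5 - 0.9*x1 + 1.1*x2);  /* hidden unit 2        */
      p  = 0.1 + 0.8*h1 - 0.4*h2;         /* linear output        */
   run;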

These macros run under release 6.07.03 or later. They have not been
tested in earlier releases. NETNLP will not run in releases prior to
6.07.03 because of the absence of PROC NLP. You should use the latest
release available to you because of performance enhancements in the
NLP and MODEL procedures.  These macros are intended as examples that
you can modify and extend in various ways to meet your particular needs.
They have been kept as simple as possible while still being of practical
use. Various useful features, such as missing value handling, have been
omitted to keep the macros (relatively) easy to understand.

TNN is a much more elaborate system of macros for feedforward neural
nets including a variety of built-in activation and loss functions,
multiple hidden layers, direct input-output connections, missing value
handling, categorical variables, standardization of inputs and targets,
and multiple preliminary optimizations from random initial values to
avoid local minima.  TNN requires the SAS/OR product in release 6.08 or
later. Release 6.10 or later is strongly recommended.

NETIML is a collection of SAS/IML modules and macros for training and
running multilayer perceptrons with a variety of activation and loss
functions. NETIML requires the SAS/IML product in release 6.08 or
later. NETIML is slower than the macros using the NLP or MODEL
procedures.

For ill-conditioned nonlinear models such as MLPs with hidden layers,
results from different procedures, different releases of SAS software,
or different machines will often not be identical. Differences in the
iteration histories and values of the weights are to be expected.
However, the outputs (predicted values) should be similar, at least to 2
or 3 significant digits, except where local minima are encountered.

Various other types of neural networks are similar or identical to
standard statistical methods. For example:

   Second- and higher-order nets are linear models or generalized
   linear models with interaction terms. They can be implemented
   directly with PROCs GENMOD, GLM, or RSREG.

   Functional-link nets are linear models or generalized linear models
   that include various transformations of the predictor variables.
   They can be implemented with PROC TRANSREG or with a DATA step to
   do the transformations followed by PROCs GENMOD, GLM, LOGISTIC,
   or REG.

   AVQ (Adaptive Vector Quantization) is a class of nonconvergent
   algorithms for least-squares cluster analysis. Better results can
   be obtained with PROC FASTCLUS or CLUSTER.

   Probabilistic neural nets are identical to kernel discriminant
   analysis, which can be done with PROC DISCRIM.

   General regression neural nets are identical to Nadaraya-Watson
   kernel regression, which can be done in the unidimensional case
   with SAS/INSIGHT software. For the multidimensional case, the
   RBF macro produces similar results.
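
As a concrete illustration of the functional-link case above, the
transformations and the fit can be done in two steps. The variable
names are made up for the example, not taken from the papers:

   data flink;
      set mydata;        /* assumes inputs x1, x2 and target y */
      x1sq = x1*x1;      /* squared terms                      */
      x2sq = x2*x2;
      x1x2 = x1*x2;      /* cross-product (interaction) term   */
   run;

   proc reg data=flink;
      model y = x1 x2 x1sq x2sq x1x2;
   run;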

Please contact Warren Sarle (saswss@unx.sas.com) if you have other
questions about how to do neural nets with SAS software.
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
