Newsgroups: comp.ai.neural-nets,comp.answers,news.answers
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!enews.sgi.com!news.sgi.com!news.msfc.nasa.gov!newsfeed.internetmci.com!howland.erols.net!news.sprintlink.net!news-stk-200.sprintlink.net!news.sprintlink.net!news-chi-13.sprintlink.net!interpath!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: comp.ai.neural-nets FAQ, Part 4 of 7: Books, data, etc.
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn4.posting_841287628@hotellng.unx.sas.com>
Supersedes: <nn4.posting_838609226@hotellng.unx.sas.com>
Approved: news-answers-request@MIT.EDU
Date: Thu, 29 Aug 1996 03:00:29 GMT
Expires: Thu, 3 Oct 1996 03:00:28 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Keywords: frequently asked questions, answers
Followup-To: comp.ai.neural-nets
Lines: 1384
Xref: glinda.oz.cs.cmu.edu comp.ai.neural-nets:33203 comp.answers:20739 news.answers:80409

Archive-name: ai-faq/neural-nets/part4
Last-modified: 1996-08-15
URL: ftp://ftp.sas.com/pub/neural/FAQ4.html
Maintainer: saswss@unx.sas.com (Warren S. Sarle)

This is part 4 (of 7) of a monthly posting to the Usenet newsgroup
comp.ai.neural-nets. See part 1 of this posting for full information
about what it is all about.

========== Questions ========== 
********************************

Part 1: Introduction
Part 2: Learning
Part 3: Generalization
Part 4: Books, data, etc.

   Good literature about Neural Networks?
   Journals and magazines about Neural Networks?
   The most important conferences concerned with Neural Networks?
   Neural Network Associations?
   Other sources of information about NNs?
   Databases for experimentation with NNs?

Part 5: Free software
Part 6: Commercial software
Part 7: Hardware

------------------------------------------------------------------------

Subject: Good literature about Neural Networks?
===============================================

The Best
++++++++

The best popular introduction to NNs
------------------------------------

Hinton, G.E. (1992), "How Neural Networks Learn from Experience", Scientific
American, 267 (September), 144-151. 

The best elementary textbooks on using NNs
------------------------------------------

Smith, M. (1993). Neural Networks for Statistical Modeling, NY: Van Nostrand
Reinhold. 
Smith is not a statistician, but he tries. The book has entire brief
chapters on overfitting and validation (early stopping and split-sample
validation, which he incorrectly calls cross-validation), putting it
a rung above most other introductions to NNs. There are also brief chapters
on data preparation and diagnostic plots, topics usually ignored in
elementary NN books. Only feedforward nets are covered in any detail. 
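The distinction Smith muddles can be made concrete. In split-sample (holdout)
validation, one fixed subset is set aside once; in k-fold cross-validation,
every case is held out exactly once across k train/validate splits. A minimal
Python sketch of the two schemes (not from any of the books listed; the toy
`evaluate` function merely stands in for training a net and measuring its
error on the held-out cases):

```python
import random

def evaluate(train, valid):
    """Toy stand-in for 'train a network, measure held-out error':
    predict the mean of the training targets, return mean squared error."""
    mean_y = sum(y for _, y in train) / len(train)
    return sum((y - mean_y) ** 2 for _, y in valid) / len(valid)

def split_sample_validation(data, frac=0.75, seed=0):
    """Hold out one fixed subset: train on `frac` of the cases,
    validate on the remainder. Yields a single error estimate."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * frac)
    return evaluate(shuffled[:cut], shuffled[cut:])

def k_fold_cross_validation(data, k=5, seed=0):
    """Every case is held out exactly once: k train/validate splits,
    error estimates averaged over the k folds."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    folds = [shuffled[i::k] for i in range(k)]
    errors = []
    for i in range(k):
        valid = folds[i]
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        errors.append(evaluate(train, valid))
    return sum(errors) / k
```

The practical difference: the split-sample estimate depends on the luck of one
split, while cross-validation uses all the data for both training and
validation, at the cost of k training runs.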

Weiss, S.M. & Kulikowski, C.A. (1991), Computer Systems That Learn,
Morgan Kaufmann. ISBN 1 55860 065 5. 
Briefly covers, at a very elementary level, feedforward nets, linear and
nearest-neighbor discriminant analysis, trees, and expert systems. For a book
at this level, it has an unusually good chapter on estimating generalization
error, including bootstrapping. 

The best elementary textbook on using and programming NNs
---------------------------------------------------------

Masters, Timothy (1994). Practical Neural Network Recipes in C++, Academic
Press, ISBN 0-12-479040-2, US $45 incl. disks.
Masters has written three exceptionally good books on NNs (the two others
are listed below). He combines generally sound practical advice with some
basic statistical knowledge to produce a programming text that is far
superior to the competition (see "The Worst" below). 

The best intermediate textbooks on NNs
--------------------------------------

Bishop, C.M. (1995). Neural Networks for Pattern Recognition, Oxford:
Oxford University Press. ISBN 0-19-853849-9 (hardback) or 0-19-853864-2
(paperback), xvii+482 pages.
This is definitely the best book on neural nets for practical applications
(rather than for neurobiological models). It is the only textbook on neural
nets that I have seen that is statistically solid.
"Bishop is a leading researcher who has a deep understanding of the material
and has gone to great lengths to organize it in a sequence that makes sense.
He has wisely avoided the temptation to try to cover everything and has
therefore omitted interesting topics like reinforcement learning, Hopfield
networks, and Boltzmann machines in order to focus on the types of neural
networks that are most widely used in practical applications. He assumes
that the reader has the basic mathematical literacy required for an
undergraduate science degree, and using these tools he explains everything
from scratch. Before introducing the multilayer perceptron, for example, he
lays a solid foundation of basic statistical concepts. So the crucial
concept of overfitting is introduced using easily visualized examples of
one-dimensional polynomials and only later applied to neural networks. An
impressive aspect of this book is that it takes the reader all the way from
the simplest linear models to the very latest Bayesian multilayer neural
networks without ever requiring any great intellectual leaps." -Geoffrey
Hinton, from the foreword. 

Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of
Neural Computation. Addison-Wesley: Redwood City, California. ISBN
0-201-50395-6 (hardbound) and 0-201-51560-1 (paperbound)
"My first impression is that this one is by far the best book on the topic.
And it's below $30 for the paperback."; "Well written, theoretical (but not
overwhelming)"; "It provides a good balance of model development,
computational algorithms, and applications. The mathematical derivations are
especially well done"; "Nice mathematical analysis on the mechanism of
different learning algorithms"; "It is NOT for the mathematical beginner. If you
don't have a good grasp of higher level math, this book can be really tough
to get through."

The best advanced textbook covering NNs
---------------------------------------

Ripley, B.D. (1996) Pattern Recognition and Neural Networks, Cambridge:
Cambridge University Press, ISBN 0-521-46086-7 (hardback), xii+403 pages.
Brian Ripley's new book is an excellent sequel to Bishop (1995). Ripley
starts where Bishop left off, with Bayesian inference and statistical
decision theory, and then covers some of the same material on NNs as Bishop
but at a higher mathematical level. Ripley also covers a variety of methods
that are not discussed, or discussed only briefly, by Bishop, such as
tree-based methods and belief networks. While Ripley is best appreciated by
people with a background in mathematical statistics, the numerous realistic
examples in his book will be of interest even to beginners in neural nets.

The best books on image and signal processing with NNs
------------------------------------------------------

Masters, T. (1994), Signal and Image Processing with Neural Networks: A
C++ Sourcebook, NY: Wiley.

Cichocki, A. and Unbehauen, R. (1993). Neural Networks for Optimization
and Signal Processing. NY: John Wiley & Sons, ISBN 0-471-930105 (hardbound),
526 pages, $57.95. 
"Partly a textbook and partly a research monograph; introduces the basic
concepts, techniques, and models related to neural networks and
optimization, excluding rigorous mathematical details. Accessible to a wide
readership with a differential calculus background. The main coverage of the
book is on recurrent neural networks with continuous state variables. The
book title would be more appropriate without mentioning signal processing.
Well edited, good illustrations."

The best book on time-series forecasting with NNs
-------------------------------------------------

Weigend, A.S. and Gershenfeld, N.A., eds. (1994) Time Series Prediction:
Forecasting the Future and Understanding the Past, Addison-Wesley: Reading,
MA. 

The best book on neurofuzzy systems
-----------------------------------

Brown, M., and Harris, C. (1994), Neurofuzzy Adaptive Modelling and
Control, NY: Prentice Hall. 

The best comparison of NNs with other classification methods
------------------------------------------------------------

Michie, D., Spiegelhalter, D.J. and Taylor, C.C. (1994), Machine Learning,
Neural and Statistical Classification, Ellis Horwood. 

Books for the Beginner:
+++++++++++++++++++++++

Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing.
Chapman and Hall. (ISBN 0-412-37780-2). 
Comments: "This book seems to be intended for the first year of university
education."

Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction. Adam
Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2). 
Comments: "It's clearly written. Lots of hints as to how to get the adaptive
models covered to work (not always well explained in the original sources).
Consistent mathematical terminology. Covers perceptrons,
error-backpropagation, Kohonen self-org model, Hopfield type models, ART,
and associative memories."

Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems. MIT Press:
Cambridge, Massachusetts. (ISBN 0-262-03156-6). 
The authors try to translate mathematical formulas into English. The results
are likely to disturb people who appreciate either mathematics or English.
Have the authors never heard that "a picture is worth a thousand words"?
What few diagrams they have (such as the one on p. 74) tend to be confusing.
Their jargon is peculiar even by NN standards. As is evident from claims
such as (p. 202): 

   Unlike the backpropagation network, a counterpropagation network
   cannot be fooled into finding a local minimum solution. This means
   that the network is guaranteed to find the correct response ... to an
   input, no matter what. 

the authors do not understand elementary properties of error functions and
optimization algorithms. Like most introductory books, this one neglects the
difficulties of getting good generalization--the authors simply declare (p.
8) that "A neural network is able to generalize"! 

Chester, M. (1993). Neural Networks: A Tutorial, Englewood Cliffs, NJ: PTR
Prentice Hall. 
Shallow, sometimes confused, especially with regard to Kohonen networks. 

Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction. Van
Nostrand Reinhold: New York. 
Comments: "Like Wasserman's book, Dayhoff's book is also very easy to
understand".

Fausett, L. V. (1994). Fundamentals of Neural Networks: Architectures,
Algorithms and Applications, Prentice Hall, ISBN 0-13-334186-0. Also
published as a Prentice Hall International Edition, ISBN 0-13-042250-9.
Sample software (source code listings in C and Fortran) is included in an
Instructor's Manual.
"Intermediate in level between Wasserman and Hertz/Krogh/Palmer. Algorithms
for a broad range of neural networks, including a chapter on Adaptive
Resonance Theory with ART2. Simple examples for each network."

Freeman, James (1994). Simulating Neural Networks with Mathematica,
Addison-Wesley, ISBN: 0-201-56629-X. Helps the reader make his own NNs. The
Mathematica code for the programs in the book is also available through the
internet: Send mail to MathSource@wri.com or try http://www.wri.com/ on the
World Wide Web.

Freeman, J.A. and Skapura, D.M. (1991). Neural Networks: Algorithms,
Applications, and Programming Techniques, Reading, MA: Addison-Wesley. 
A good book for beginning programmers who want to learn how to write NN
programs while avoiding any understanding of what NNs do or why they do it. 

Gately, E. (1996). Neural Networks for Financial Forecasting. New York:
John Wiley and Sons, Inc.
Franco Insana comments:

* Decent book for the neural net beginner
* Very little devoted to statistical framework, although there 
    is some formulation of backprop theory
* Some food for thought
* Nothing here for those with any neural net experience

Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley. 
Comments: "A good book", "comprises a nice historical overview and a chapter
about NN hardware. Well structured prose. Makes important concepts clear."

McClelland, J. L. and Rumelhart, D. E. (1988). Explorations in Parallel
Distributed Processing: Computational Models of Cognition and Perception
(software manual). The MIT Press. 
Comments: "Written in a tutorial style, and includes 2 diskettes of NN
simulation programs that can be compiled on MS-DOS or Unix (and they do too
!)"; "The programs are pretty reasonable as an introduction to some of the
things that NNs can do."; "There are *two* editions of this book. One comes
with disks for the IBM PC, the other comes with disks for the Macintosh".

McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural
Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0). 
Lots of applications without technical details, lots of hype, lots of goofs,
no formulas.

Muller, B., Reinhardt, J., Strickland, M. T. (1995). Neural Networks. An
Introduction (2nd ed.). Berlin, Heidelberg, New York: Springer-Verlag. ISBN
3-540-60207-0. (DOS 3.5" disk included.) 
Comments: The book was developed out of a course on neural-network models
with computer demonstrations that was taught by the authors to Physics
students. The book comes together with a PC-diskette. The book is divided
into three parts: (1) Models of Neural Networks, describing several
architectures and learning rules, including the mathematics; (2) Statistical
Physics of Neural Networks, a "hard-core" physics section developing formal
theories of stochastic neural networks; (3) Computer Codes, explaining
the demonstration programs. The first part gives a nice introduction to
neural networks together with the formulas, and working through the
demonstration programs helps develop a 'feel' for neural networks.

Orchard, G.A. & Phillips, W.A. (1991). Neural Computation: A Beginner's
Guide. Lawrence Erlbaum Associates: London. 
Comments: "Short user-friendly introduction to the area, with a
non-technical flavour. Apparently accompanies a software package, but I
haven't seen that yet".

Rao, V.B & H.V. (1993). C++ Neural Networks and Fuzzy Logic. MIS:Press,
ISBN 1-55828-298-x, US $45 incl. disks. 
"Probably not 'leading edge' stuff but detailed enough to get your hands
dirty!"

Swingler, K. (1996), Applying Neural Networks: A Practical Guide, London:
Academic Press. 
This book has lots of good advice liberally sprinkled with errors, some bad
advice, and the occasional howler. Experts will learn nothing, while
beginners will be unable to separate the useful information from the
dangerous. The most ludicrous thing I've found in the book is the claim that
Hecht-Nielsen used Kolmogorov's theorem to show that "you will never require
more than twice the number of hidden units as you have inputs" (p. 53) in an
MLP with one hidden layer. Hecht-Nielsen has made an occasional published
mistake himself, but I am sure he has never said anything this idiotic! Then
Swingler goes on to say that Kurkova, V. (1991), "Kolmogorov's theorem is
relevant," Neural Computation, 3, 617-622, confirmed this alleged upper
bound on the number of hidden units--this is a gross insult to Kurkova! 

Wasserman, P. D. (1989). Neural Computing: Theory & Practice. Van Nostrand
Reinhold: New York. (ISBN 0-442-20743-3) 
Comments: "Wasserman flatly enumerates some common architectures from an
engineer's perspective ('how it works') without ever addressing the
underlying fundamentals ('why it works') - important basic concepts such as
clustering, principal components or gradient descent are not treated. It's
also full of errors, and unhelpful diagrams drawn with what appears to be
PCB board layout software from the '70s. For anyone who wants to do active
research in the field I consider it quite inadequate"; "Okay, but too
shallow"; "Quite easy to understand"; "The best bedtime reading for Neural
Networks. I have given this book to numerous colleagues who want to know NN
basics, but who never plan to implement anything. An excellent book to give
your manager."

The Classics:
+++++++++++++

Kohonen, T. (1984). Self-organization and Associative Memory.
Springer-Verlag: New York. (2nd Edition: 1988; 3rd edition: 1989). 
Comments: "The section on Pattern mathematics is excellent."

Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed
Processing: Explorations in the Microstructure of Cognition (volumes 1 & 2).
The MIT Press. 
Comments: "As a computer scientist I found the two Rumelhart and McClelland
books really heavy going and definitely not the sort of thing to read if you
are a beginner."; "It's quite readable, and affordable (about $65 for both
volumes)."; "THE Connectionist bible".

Introductory Journal Articles:
++++++++++++++++++++++++++++++

Hinton, G. E. (1989). Connectionist learning procedures. Artificial
Intelligence, Vol. 40, pp. 185--234. 
Comments: "One of the better neural networks overview papers, although the
distinction between network topology and learning algorithm is not always
very clear. Could very well be used as an introduction to neural networks."

Knight, K. (1990). Connectionist Ideas and Algorithms. Communications of
the ACM. November 1990. Vol.33 nr.11, pp 59-74. 
Comments: "A good article, and for most people it is easy to find a copy of
this journal."

Kohonen, T. (1988). An Introduction to Neural Computing. Neural Networks,
vol. 1, no. 1. pp. 3-16. 
Comments: "A general review".

Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning
representations by back-propagating errors. Nature, vol 323 (9 October), pp.
533-536. 
Comments: "Gives a very good potted explanation of backprop NN's. It gives
sufficient detail to write your own NN simulation."

Not-quite-so-introductory Literature:
+++++++++++++++++++++++++++++++++++++

Anderson, J. A. and Rosenfeld, E. (Eds). (1988). Neurocomputing:
Foundations of Research. The MIT Press: Cambridge, MA. 
Comments: "An expensive book, but excellent for reference. It is a
collection of reprints of most of the major papers in the field." 

Anderson, J. A., Pellionisz, A. and Rosenfeld, E. (Eds). (1990). 
Neurocomputing 2: Directions for Research. The MIT Press: Cambridge, MA. 
Comments: "The sequel to their well-known Neurocomputing book."

Bourlard, H.A., and Morgan, N. (1994), Connectionist Speech Recognition: A
Hybrid Approach, Boston: Kluwer Academic Publishers.

Deco, G. and Obradovic, D. (1996), An Information-Theoretic Approach to
Neural Computing, NY: Springer-Verlag. 

Haykin, S. (1994). Neural Networks, a Comprehensive Foundation.
Macmillan, New York, NY.
"A very readable, well-written intermediate text on NNs. The perspective is
primarily one of pattern recognition, estimation and signal processing.
However, there are well-written chapters on neurodynamics and VLSI
implementation. Though there is emphasis on formal mathematical models of
NNs as universal approximators, statistical estimators, etc., there are also
examples of NNs used in practical applications. The problem sets at the end
of each chapter nicely complement the material. In the bibliography are over
1000 references."

Khanna, T. (1990). Foundations of Neural Networks. Addison-Wesley: New
York. 
Comments: "Not so bad (with a page of erroneous formulas, if I remember
correctly, and the number of hidden layers isn't well described)."; "Khanna's
intention in writing his book with math analysis should be commended but he
made several mistakes in the math part".

Kung, S.Y. (1993). Digital Neural Networks, Prentice Hall, Englewood
Cliffs, NJ.

Levine, D. S. (1990). Introduction to Neural and Cognitive Modeling.
Lawrence Erlbaum: Hillsdale, N.J. 
Comments: "Highly recommended".

Lippmann, R. P. (April 1987). An introduction to computing with neural nets.
IEEE Acoustics, Speech, and Signal Processing Magazine. vol. 4, no. 2, pp.
4-22. 
Comments: "Much acclaimed as an overview of neural networks, but rather
inaccurate on several points. The categorization into binary and continuous-
valued input neural networks is rather arbitrary, and may be confusing for
the inexperienced reader. Not all networks discussed are of equal
importance."

Maren, A., Harston, C. and Pap, R., (1990). Handbook of Neural Computing
Applications. Academic Press. ISBN: 0-12-471260-6. (451 pages) 
Comments: "They cover a broad area"; "Introductory with suggested
applications implementation".

Masters, T. (1995) Advanced Algorithms for Neural Networks: A C++
Sourcebook, NY: John Wiley and Sons, ISBN 0-471-10588-0
Clear explanations of conjugate gradient and Levenberg-Marquardt
optimization algorithms, simulated annealing, kernel regression (GRNN) and
discriminant analysis (PNN), Gram-Charlier networks, dimensionality
reduction, cross-validation, and bootstrapping. 

Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks,
Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6) 
Comments: "An excellent book that ties together classical approaches to
pattern recognition with Neural Nets. Most other NN books do not even
mention conventional approaches."

Refenes, A. (Ed.) (1995). Neural Networks in the Capital Markets.
Chichester, England: John Wiley and Sons, Inc.
Franco Insana comments:

* Not for the beginner
* Excellent introductory material presented by editor in first 5 
  chapters, which could be a valuable reference source for any 
  practitioner
* Very thought-provoking
* Mostly backprop-related
* Most contributors lay good statistical foundation
* Overall, a wealth of information and ideas, but the reader has to 
  sift through it all to come away with anything useful

Simpson, P. K. (1990). Artificial Neural Systems: Foundations, Paradigms,
Applications and Implementations. Pergamon Press: New York. 
Comments: "Contains a very useful 37 page bibliography. A large number of
paradigms are presented. On the negative side the book is very shallow. Best
used as a complement to other books".

Wasserman, P.D. (1993). Advanced Methods in Neural Computing. Van
Nostrand Reinhold: New York (ISBN: 0-442-00461-3). 
Comments: Several neural network topics are discussed, e.g. Probabilistic
Neural Networks, Backpropagation and beyond, neural control, Radial Basis
Function Networks, Neural Engineering. Furthermore, several subjects related
to neural networks are mentioned e.g. genetic algorithms, fuzzy logic,
chaos. Just the functionality of these subjects is described; enough to get
you started. Lots of references are given to more elaborate descriptions.
Easy to read, no extensive mathematical background necessary.

Zeidenberg. M. (1990). Neural Networks in Artificial Intelligence. Ellis
Horwood, Ltd., Chichester. 
Comments: "Gives the AI point of view".

Zornetzer, S. F., Davis, J. L. and Lau, C. (1990). An Introduction to Neural
and Electronic Networks. Academic Press. (ISBN 0-12-781881-2) 
Comments: "Covers quite a broad range of topics (collection of
articles/papers )."; "Provides a primer-like introduction and overview for a
broad audience, and employs a strong interdisciplinary emphasis".

Zurada, Jacek M. (1992). Introduction To Artificial Neural Systems.
Hardcover, 785 Pages, 317 Figures, ISBN 0-534-95460-X, 1992, PWS Publishing
Company, Price: $56.75 (includes shipping, handling, and the ANS software
diskette). Solutions Manual available.
"Cohesive and comprehensive book on neural nets; as an engineering-oriented
introduction, but also as a research foundation. Thorough exposition of
fundamentals, theory and applications. Training and recall algorithms appear
in boxes showing steps of algorithms, thus making programming of learning
paradigms easy. Many illustrations and intuitive examples. Winner among NN
textbooks at a senior UG/first year graduate level-[175 problems]."
Contents: Intro, Fundamentals of Learning, Single-Layer & Multilayer
Perceptron NN, Assoc. Memories, Self-organizing and Matching Nets,
Applications, Implementations, Appendix. 

The Worst
+++++++++

   Blum, Adam (1992), Neural Networks in C++, Wiley. 

   Welstead, Stephen T. (1994), Neural Network and Fuzzy Logic
   Applications in C/C++, Wiley. 

Both Blum and Welstead contribute to the dangerous myth that any idiot can
use a neural net by dumping in whatever data are handy and letting it train
for a few days. They both have little or no discussion of generalization,
validation, and overfitting. Neither provides any valid advice on choosing
the number of hidden nodes. If you have ever wondered where these stupid
"rules of thumb" that pop up frequently come from, here's a source for one
of them: 

   "A rule of thumb is for the size of this [hidden] layer to be
   somewhere between the input layer size ... and the output layer size
   ..." Blum, p. 60. 

(John Lazzaro tells me he recently "reviewed a paper that cited this rule of
thumb--and referenced this book! Needless to say, the final version of that
paper didn't include the reference!") 

Blum offers some profound advice on choosing inputs: 

   "The next step is to pick as many input factors as possible that
   might be related to [the target]." 

Blum also shows a deep understanding of statistics: 

   "A statistical model is simply a more indirect way of learning
   correlations. With a neural net approach, we model the problem
   directly." p. 8. 

Blum at least mentions some important issues, however simplistic his advice
may be. Welstead just ignores them. What Welstead gives you is code--vast
amounts of code. I have no idea how anyone could write that much code for a
simple feedforward NN. Welstead's approach to validation, in his chapter on
financial forecasting, is to reserve two cases for the validation set! 

My comments apply only to the text of the above books. I have not examined
or attempted to compile the code. 

------------------------------------------------------------------------

Subject: Journals and magazines about Neural Networks?
======================================================

[to be added: comments on speed of reviewing and publishing,
              whether they accept TeX format or ASCII by e-mail, etc.]

A. Dedicated Neural Network Journals:
+++++++++++++++++++++++++++++++++++++

Title:   Neural Networks
Publish: Pergamon Press
Address: Pergamon Journals Inc., Fairview Park, Elmsford,
         New York 10523, USA and Pergamon Journals Ltd.
         Headington Hill Hall, Oxford OX3 0BW, England
Freq.:   10 issues/year (vol. 1 in 1988)
Cost/Yr: Free with INNS or JNNS or ENNS membership ($45?),
         Individual $65, Institution $175
ISSN #:  0893-6080
WWW:     http://www.elsevier.nl/locate/inca/841
Remark:  Official Journal of International Neural Network Society (INNS),
         European Neural Network Society (ENNS) and Japanese Neural
         Network Society (JNNS).
         Contains Original Contributions, Invited Review Articles, Letters
         to Editor, Book Reviews, Editorials, Announcements, Software Surveys.

Title:   Neural Computation
Publish: MIT Press
Address: MIT Press Journals, 55 Hayward Street Cambridge,
         MA 02142-9949, USA, Phone: (617) 253-2889
Freq.:   Quarterly (vol. 1 in 1989)
Cost/Yr: Individual $45, Institution $90, Students $35; Add $9 Outside USA
ISSN #:  0899-7667
URL:     http://www-mitpress.mit.edu/jrnls-catalog/neural.html
Remark:  Combination of Reviews (10,000 words), Views (4,000 words)
         and Letters (2,000 words).  I have found this journal to be of
         outstanding quality.
         (Note: Remarks supplied by Mike Plonski "plonski@aero.org")

Title:   IEEE Transactions on Neural Networks
Publish: Institute of Electrical and Electronics Engineers (IEEE)
Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ,
         08855-1331 USA. Tel: (201) 981-0060
Cost/Yr: $10 for Members belonging to participating IEEE societies
Freq.:   Quarterly (vol. 1 in March 1990)
URL:     http://www.ieee.org/nnc/pubs/transactions.html
Remark:  Devoted to the science and technology of neural networks
         which disclose significant  technical knowledge, exploratory
         developments and applications of neural networks from biology to
         software to hardware.  Emphasis is on artificial neural networks.
         Specific aspects include self organizing systems, neurobiological
         connections, network dynamics and architecture, speech recognition,
         electronic and photonic implementation, robotics and controls.
         Includes Letters concerning new research results.
         (Note: Remarks are from journal announcement)

Title:   International Journal of Neural Systems
Publish: World Scientific Publishing
Address: USA: World Scientific Publishing Co., 1060 Main Street, River Edge,
         NJ 07666. Tel: (201) 487 9655; Europe: World Scientific Publishing
         Co. Ltd., 57 Shelton Street, London WC2H 9HE, England.
         Tel: (0171) 836 0888; Asia: World Scientific Publishing Co. Pte. Ltd.,
         1022 Hougang Avenue 1 #05-3520, Singapore 1953, Rep. of Singapore
         Tel: 382 5663.
Freq.:   Quarterly (Vol. 1 in 1990)
Cost/Yr: Individual $122, Institution $255 (plus $15-$25 for postage)
ISSN #:  0129-0657 (IJNS)
Remark:  The International Journal of Neural Systems is a quarterly
         journal which covers information processing in natural
         and artificial neural systems. Contributions include research papers,
         reviews, and Letters to the Editor - communications under 3,000
         words in length, which are published within six months of receipt.
         Other contributions are typically published within nine months.
         The journal presents a fresh undogmatic attitude towards this
         multidisciplinary field and aims to be a forum for novel ideas and
         improved understanding of collective and cooperative phenomena with
         computational capabilities.
         Papers should be submitted to World Scientific's UK office. Once a
         paper is accepted for publication, authors are invited to e-mail
         the LaTeX source file of their paper in order to expedite publication.

Title:   International Journal of Neurocomputing
Publish: Elsevier Science Publishers, Journal Dept.; PO Box 211;
         1000 AE Amsterdam, The Netherlands
Freq.:   Quarterly (vol. 1 in 1989)
WWW:     http://www.elsevier.nl/locate/inca/505628

Title:   Neural Processing Letters
Publish: D facto publications
Address: 45 rue Masui; B-1210 Brussels, Belgium
         Phone: (32) 2 245 43 63;  Fax: (32) 2 245 46 94
Freq:    6 issues/year (vol. 1 in September 1994)
Cost/Yr: BEF 4400 (about $140)
ISSN #:  1370-4621
URL:     http://www.dice.ucl.ac.be/neural-nets/NPL/NPL.html
FTP:     ftp://ftp.dice.ucl.ac.be/pub/neural-nets/NPL
Remark:  The aim of the journal is to rapidly publish new ideas, original
         developments and work in progress.  Neural Processing Letters
         covers all aspects of the Artificial Neural Networks field.
         Publication delay is about 3 months.

Title:   Neural Network News
Publish: AIWeek Inc.
Address: Neural Network News, 2555 Cumberland Parkway, Suite 299,
         Atlanta, GA 30339 USA. Tel: (404) 434-2187
Freq.:   Monthly (beginning September 1989)
Cost/Yr: USA and Canada $249, Elsewhere $299
Remark:  Commercial Newsletter

Title:   Network: Computation in Neural Systems
Publish: IOP Publishing Ltd
Address: Europe: IOP Publishing Ltd, Techno House, Redcliffe Way, Bristol
         BS1 6NX, UK; IN USA: American Institute of Physics, Subscriber
         Services 500 Sunnyside Blvd., Woodbury, NY  11797-2999
Freq.:   Quarterly (1st issue 1990)
Cost/Yr: USA: $180,  Europe: 110 pounds
Remark:  Description: "a forum for integrating theoretical and experimental
         findings across relevant interdisciplinary boundaries."  Contents:
         Submitted articles are reviewed by two technical referees for the
         paper's interdisciplinary format and accessibility.  Also Viewpoints
         and Reviews commissioned by the editors, abstracts (with reviews) of
         articles published in other journals, and book reviews.
         Comment: While the price discourages me (my comments are based
         upon a free sample copy), I think that the journal succeeds
         very well.  The highest density of interesting articles I
         have found in any journal.
         (Note: Remarks supplied by kehoe@csufres.CSUFresno.EDU)

Title:   Connection Science: Journal of Neural Computing,
         Artificial Intelligence and Cognitive Research
Publish: Carfax Publishing
Address: Europe: Carfax Publishing Company, PO Box 25, Abingdon, Oxfordshire
         OX14 3UE, UK.
         USA: Carfax Publishing Company, PO Box 2025, Dunnellon, Florida
         34430-2025, USA
         Australia: Carfax Publishing Company, Locked Bag 25, Deakin,
         ACT 2600, Australia
Freq.:   Quarterly (vol. 1 in 1989)
Cost/Yr: Personal rate:
         48 pounds (EC) 66 pounds (outside EC) US$118 (USA and Canada)
         Institutional rate:
         176 pounds (EC) 198 pounds (outside EC) US$340 (USA and Canada)

Title:   International Journal of Neural Networks
Publish: Learned Information
Freq.:   Quarterly (vol. 1 in 1989)
Cost/Yr: 90 pounds
ISSN #:  0954-9889
Remark:  The journal contains articles, a conference report (at least the
         issue I have), news and a calendar.
         (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")

Title:   Sixth Generation Systems (formerly Neurocomputers)
Publish: Gallifrey Publishing
Address: Gallifrey Publishing, PO Box 155, Vicksburg, Michigan, 49097, USA
         Tel: (616) 649-3772, 649-3592 fax
Freq.    Monthly (1st issue January, 1987)
ISSN #:  0893-1585
Editor:  Derek F. Stubbs
Cost/Yr: $79 (USA, Canada), US$95 (elsewhere)
Remark:  Runs 8 to 16 pages monthly. In 1995 it will go to floppy
         disc-based publishing with databases; "the equivalent to 50 pages
         per issue are planned." Often focuses on specific topics: e.g.,
         August, 1994 contains two articles: "Economics, Times Series and
         the Market," and "Finite Particle Analysis - [part] II."  Stubbs
         also directs the company Advanced Forecasting Technologies.
         (Remark by Ed Rosenfeld: ier@aol.com)

Title:   JNNS Newsletter (Newsletter of the Japan Neural Network Society)
Publish: The Japan Neural Network Society
Freq.:   Quarterly (vol. 1 in 1989)
Remark:  (IN JAPANESE LANGUAGE) Official Newsletter of the Japan Neural
         Network Society (JNNS)
         (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")

Title:   Neural Networks Today
Remark:  I found this title on a bulletin board in October of last year.
         It was a message of Tim Pattison, timpatt@augean.OZ
         (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")

Title:   Computer Simulations in Brain Science

Title:   International Journal of Neuroscience

Title:   Neural Network Computation
Remark:  Possibly the same as "Neural Computation"

Title:   Neural Computing and Applications
Freq.:   Quarterly
Publish: Springer Verlag
Cost/yr: 120 Pounds
Remark:  This is the journal of the Neural Computing Applications Forum.
         It publishes original research and other information in the
         field of practical applications of neural computing.

B. NN Related Journals:
+++++++++++++++++++++++

Title:   Complex Systems
Publish: Complex Systems Publications
Address: Complex Systems Publications, Inc., P.O. Box 6149, Champaign,
         IL 61821-8149, USA
Freq.:   6 times per year (1st volume is 1987)
ISSN #:  0891-2513
Cost/Yr: Individual $75, Institution $225
Remark:  The journal COMPLEX SYSTEMS is devoted to the rapid publication
         of research on the science, mathematics, and engineering of
         systems with simple components but complex overall behavior.
         Send mail to
         "jcs@jaguar.ccsr.uiuc.edu" for additional info.
         (Remark is from announcement on Net)

Title:   Biological Cybernetics (Kybernetik)
Publish: Springer Verlag
Freq.:   Monthly (vol. 1 in 1961)

Title:   Various IEEE Transactions and Magazines
Publish: IEEE
Remark:  Primarily see IEEE Trans. on System, Man and Cybernetics;
         Various Special Issues: April 1990 IEEE Control Systems
         Magazine.; May 1989 IEEE Trans. Circuits and Systems.;
         July 1988 IEEE Trans. Acoust. Speech Signal Process.

Title:   The Journal of Experimental and Theoretical Artificial Intelligence
Publish: Taylor & Francis, Ltd.
Address: London, New York, Philadelphia
Freq.:   ? (1st issue Jan 1989)
Remark:  For submission information, please contact either of the editors:
         Eric Dietrich                        Chris Fields
         PACSS - Department of Philosophy     Box 30001/3CRL
         SUNY Binghamton                      New Mexico State University
         Binghamton, NY 13901                 Las Cruces, NM 88003-0001
         dietrich@bingvaxu.cc.binghamton.edu  cfields@nmsu.edu

Title:   The Behavioral and Brain Sciences
Publish: Cambridge University Press
Remark:  (Expensive as hell, I'm sure.)
         This is a delightful journal that encourages discussion on a
         variety of controversial topics.  I have especially enjoyed
         reading some papers in there by Dana Ballard and Stephen
         Grossberg (separate papers, not collaborations) a few years
         back.  They have a really neat concept: they get a paper,
         then invite a number of noted scientists in the field to
         praise it or trash it.  They print these commentaries, and
         give the author(s) a chance to make a rebuttal or
         concurrence.  Sometimes, as I'm sure you can imagine, things
         get pretty lively.  I'm reasonably sure they are still at
         it--I think I saw them make a call for reviewers a few
         months ago.  Their reviewers are called something like
         Behavioral and Brain Associates, and I believe they have to
         be nominated by current associates, and should be fairly
         well established in the field.  That's probably more than I
         really know about it but maybe if you post it someone who
         knows more about it will correct any errors I have made.
         The main thing is that I liked the articles I read. (Note:
         remarks by Don Wunsch )

Title:   International Journal of Applied Intelligence
Publish: Kluwer Academic Publishers
Remark:  first issue in 1990(?)

Title:   Bulletin of Mathematical Biology

Title:   Intelligence

Title:   Journal of Mathematical Biology

Title:   Journal of Complex System

Title:   International Journal of Modern Physics C
Publish: USA: World Scientific Publishing Co., 1060 Main Street, River Edge,
         NJ 07666. Tel: (201) 487 9655; Europe: World Scientific Publishing
         Co. Ltd., 57 Shelton Street, London WC2H 9HE, England.
         Tel: (0171) 836 0888; Asia: World Scientific Publishing Co. Pte. Ltd.,
         1022 Hougang Avenue 1 #05-3520, Singapore 1953, Rep. of Singapore
         Tel: 382 5663.
Freq:    bi-monthly
Eds:     H. Herrmann, R. Brower, G.C. Fox and S. Nose

Title:   Machine Learning
Publish: Kluwer Academic Publishers
Address: Kluwer Academic Publishers
         P.O. Box 358
         Accord Station
         Hingham, MA 02018-0358 USA
Freq.:   Monthly (8 issues per year; increasing to 12 in 1993)
Cost/Yr: Individual $140 (1992); Member of AAAI or CSCSI $88
Remark:  Description: Machine Learning is an international forum for
         research on computational approaches to learning.  The journal
         publishes articles reporting substantive research results on a
         wide range of learning methods applied to a variety of task
         domains.  The ideal paper will make a theoretical contribution
         supported by a computer implementation.
         The journal has published many key papers in learning theory,
         reinforcement learning, and decision tree methods.  Recently
         it has published a special issue on connectionist approaches
         to symbolic reasoning.  The journal regularly publishes
         issues devoted to genetic algorithms as well.

Title:   INTELLIGENCE - The Future of Computing
Publish: Intelligence
Address: INTELLIGENCE, P.O. Box 20008, New York, NY 10025-1510, USA,
         212-222-1123 voice & fax; email: ier@aol.com, CIS: 72400,1013
Freq.    Monthly plus four special reports each year (1st issue: May, 1984)
ISSN #:  1042-4296
Editor:  Edward Rosenfeld
Cost/Yr: $395 (USA), US$450 (elsewhere)
Remark:  Has absorbed several other newsletters, like Synapse/Connection
         and Critical Technology Trends (formerly AI Trends).
         Covers NN, genetic algorithms, fuzzy systems, wavelets, chaos
         and other advanced computing approaches, as well as molecular
         computing and nanotechnology.

Title:   Journal of Physics A: Mathematical and General
Publish: Inst. of Physics, Bristol
Freq:    24 issues per year.
Remark:  Statistical mechanics aspects of neural networks
         (mostly Hopfield models).

Title:   Physical Review A: Atomic, Molecular and Optical Physics
Publish: The American Physical Society (Am. Inst. of Physics)
Freq:    Monthly
Remark:  Statistical mechanics of neural networks.

Title:   Information Sciences
Publish: North Holland (Elsevier Science)
Freq.:   Monthly
ISSN:    0020-0255
Editor:  Paul P. Wang; Department of Electrical Engineering; Duke University;
         Durham, NC 27706, USA

C. Journals loosely related to NNs:
+++++++++++++++++++++++++++++++++++

Title:   JOURNAL OF COMPLEXITY
Remark:  (Must rank alongside Wolfram's Complex Systems)

Title:   IEEE ASSP Magazine
Remark:  (April 1987 had the Lippmann intro. which everyone likes to cite)

Title:   ARTIFICIAL INTELLIGENCE
Remark:  (Vol 40, September 1989 had the survey paper by Hinton)

Title:   COGNITIVE SCIENCE
Remark:  (the Boltzmann machine paper by Ackley et al appeared here
         in Vol 9, 1983)

Title:   COGNITION
Remark:  (Vol 28, March 1988 contained the Fodor and Pylyshyn
         critique of connectionism)

Title:   COGNITIVE PSYCHOLOGY
Remark:  (no comment!)

Title:   JOURNAL OF MATHEMATICAL PSYCHOLOGY
Remark:  (several good book reviews)

------------------------------------------------------------------------

Subject: The most important conferences concerned with
======================================================
Neural Networks?
================

[to be added: has taken place how often yet; most emphasized topics;
 where to get proceedings/calls-for-papers etc. ]

A. Dedicated Neural Network Conferences:
++++++++++++++++++++++++++++++++++++++++

1. Neural Information Processing Systems (NIPS) Annually since 1988 in
   Denver, Colorado; late November or early December. Interdisciplinary
   conference with computer science, physics, engineering, biology,
   medicine, cognitive science topics. Covers all aspects of NNs.
   Proceedings appear several months after the conference as a book from
   Morgan Kaufmann, San Mateo, CA. 
2. International Joint Conference on Neural Networks (IJCNN) formerly
   co-sponsored by INNS and IEEE, no longer held. 
3. Annual Conference on Neural Networks (ACNN) 
4. International Conference on Artificial Neural Networks (ICANN) Annually
   in Europe. First was 1991. Major conference of European Neur. Netw. Soc.
   (ENNS) 
5. World Congress on Neural Networks (WCNN). Sponsored by INNS. 
6. European Symposium on Artificial Neural Networks (ESANN). Annually since
   1993 in Brussels, Belgium; late April; conference on the fundamental
   aspects of artificial neural networks: theory, mathematics, biology,
   relations between neural networks and other disciplines, statistics,
   learning, algorithms, models and architectures, self-organization, signal
   processing, approximation of functions, evolutive learning, etc. Contact:
   Michel Verleysen, D facto conference services, 45 rue Masui, B-1210
   Brussels, Belgium, phone: +32 2 245 43 63, fax: + 32 2 245 46 94, e-mail:
   esann@dice.ucl.ac.be 
7. Artificial Neural Networks in Engineering (ANNIE) Annually since 1991 in
   St. Louis, Missouri; held in November. (Topics: NN architectures, pattern
   recognition, neuro-control, neuro-engineering systems. Contact: ANNIE;
   Engineering Management Department; 223 Engineering Management Building;
   University of Missouri-Rolla; Rolla, MO 65401; FAX: (314) 341-6567) 
8. many many more.... 

B. Other Conferences
++++++++++++++++++++

1. International Joint Conference on Artificial Intelligence (IJCAI) 
2. Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP) 
3. Intern. Conf. on Pattern Recognition. Held every other year. Has a
   connectionist subconference. Information: General Chair Walter G.
   Kropatsch <krw@prip.tuwien.ac.at> 
4. Annual Conference of the Cognitive Science Society 
5. [Vision Conferences?] 

C. Pointers to Conferences
++++++++++++++++++++++++++

1. The journal "Neural Networks" has a list of conferences, workshops and
   meetings in each issue. This is quite interdisciplinary. 
2. There is a regular posting on comp.ai.neural-nets from Paultje Bakker:
   "Upcoming Neural Network Conferences", which lists names, dates,
   locations, contacts, and deadlines. It is also available on the WWW from 
   http://www.neuronet.ph.kcl.ac.uk/neuronet/bakker.html. 
3. The IEEE Neural Network Council maintains an up-to-date list of
   conferences at http://www.ieee.org/nnc. 

------------------------------------------------------------------------

Subject: Neural Network Associations?
=====================================

1. International Neural Network Society (INNS).
+++++++++++++++++++++++++++++++++++++++++++++++

   INNS membership includes subscription to "Neural Networks", the official
   journal of the society. Membership is $55 for non-students and $45 for
   students per year. Address: INNS Membership, P.O. Box 491166, Ft.
   Washington, MD 20749. 

2. International Student Society for Neural Networks
++++++++++++++++++++++++++++++++++++++++++++++++++++
   (ISSNNets).
   +++++++++++

   Membership is $5 per year. Address: ISSNNet, Inc., P.O. Box 15661,
   Boston, MA 02215 USA 

3. Women In Neural Network Research and technology
++++++++++++++++++++++++++++++++++++++++++++++++++
   (WINNERS).
   ++++++++++

   Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206,
   Wheaton, MD 20902. Phone: 301-933-9000. 

4. European Neural Network Society (ENNS)
+++++++++++++++++++++++++++++++++++++++++

   ENNS membership includes subscription to "Neural Networks", the official
   journal of the society. Membership is currently (1994) 50 UK pounds (35
   UK pounds for students) per year. Address: ENNS Membership, Centre for
   Neural Networks, King's College London, Strand, London WC2R 2LS, United
   Kingdom. 

5. Japanese Neural Network Society (JNNS)
+++++++++++++++++++++++++++++++++++++++++

   Address: Japanese Neural Network Society; Department of Engineering,
   Tamagawa University; 6-1-1, Tamagawa Gakuen, Machida City, Tokyo; 194
   JAPAN; Phone: +81 427 28 3457, Fax: +81 427 28 3597 

6. Association des Connexionnistes en THese (ACTH)
++++++++++++++++++++++++++++++++++++++++++++++++++

   (the French Student Association for Neural Networks); Membership is 100
   FF per year; Activities: newsletter, conference (every year), list of
   members, electronic forum; Journal 'Valgo' (ISSN 1243-4825); WWW page: 
   http://www.supelec-rennes.fr/acth/welcome.html ; Contact: acth@loria.fr 

7. Neurosciences et Sciences de l'Ingenieur (NSI)
+++++++++++++++++++++++++++++++++++++++++++++++++

   Biology & Computer Science. Activity: conference (every year). Address:
   NSI - TIRF / INPG, 46 avenue Felix Viallet, 38031 Grenoble Cedex, FRANCE 

8. IEEE Neural Networks Council
+++++++++++++++++++++++++++++++

   Web page at http://www.ieee.org/nnc 

9. SNN (Foundation for Neural Networks)
+++++++++++++++++++++++++++++++++++++++

   The Foundation for Neural Networks (SNN) is a university based non-profit
   organization that stimulates basic and applied research on neural
   networks in the Netherlands. Every year SNN organizes a symposium on
   Neural Networks. See http://www.mbfys.kun.nl/SNN/. 

You can find nice lists of NN societies in the WWW at 
http://www.emsl.pnl.gov:2080/docs/cie/neural/societies.html and at 
http://www.ieee.org:80/nnc/research/othernnsoc.html. 

------------------------------------------------------------------------

Subject: Other sources of information about NNs?
================================================

1. Backpropagator's Review
++++++++++++++++++++++++++

   One of the best introductory sources is Donald Tveter's World-Wide-Web
   page at http://www.mcs.com/~drt/bprefs.html, which contains both answers
   to additional FAQs and an annotated neural net bibliography emphasizing
   on-line articles. 

2. Neuron Digest
++++++++++++++++

   Internet Mailing List. From the welcome blurb: "Neuron-Digest is a list
   (in digest form) dealing with all aspects of neural networks (and any
   type of network or neuromorphic system)". To subscribe, send email to
   neuron-request@cattell.psych.upenn.edu. Readers of comp.ai.neural-nets
   will also find the messages posted to that newsgroup in digest form. 

3. Usenet groups comp.ai.neural-nets (Oha!) and
+++++++++++++++++++++++++++++++++++++++++++++++
   comp.theory.self-org-sys.
   +++++++++++++++++++++++++

   There is a periodic posting on comp.ai.neural-nets sent by
   srctran@world.std.com (Gregory Aharonian) about Neural Network patents.

4. Central Neural System Electronic Bulletin Board
++++++++++++++++++++++++++++++++++++++++++++++++++

   Modem: 409-737-5222; Sysop: Wesley R. Elsberry; 4160 Pirates' Beach,
   Galveston, TX 77554; welsberr@orca.tamu.edu. Many MS-DOS PD and shareware
   simulations, source code, benchmarks, demonstration packages, information
   files; some Unix, Macintosh, Amiga related files. Also available are
   files on AI, AI Expert listings 1986-1991, fuzzy logic, genetic
   algorithms, artificial life, evolutionary biology, and many Project
   Gutenberg and Wiretap etexts. No user fees have ever been charged. Home
   of the NEURAL_NET Echo, available through FidoNet, RBBS-Net, and other
   EchoMail compatible bulletin board systems. 

5. Neuroprose ftp archive site: archive.cis.ohio-state.edu
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

   ftp://archive.cis.ohio-state.edu/pub/neuroprose
   This directory contains technical reports as a public service to the
   connectionist and neural network scientific community. 

6. Neural ftp archive site: ftp.funet.fi
++++++++++++++++++++++++++++++++++++++++

   This site administers a large collection of neural network papers and
   software at the Finnish University Network file archive site ftp.funet.fi
   in directory /pub/sci/neural. It contains all the public domain software
   and papers that the maintainers have been able to find. All of these
   files have been transferred from FTP sites in the U.S. and are mirrored
   about every 3 months at the fastest. Contact: neural-adm@ftp.funet.fi 

7. BibTeX data bases of NN journals
+++++++++++++++++++++++++++++++++++

   The Center for Computational Intelligence maintains BibTeX data bases of
   various NN journals, including IEEE Transactions on Neural Networks,
   Machine Learning, Neural Computation, and NIPS, at 
   http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html or 
   ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/bib/. 
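
   As a format illustration only: each entry in these BibTeX databases is
   a plain-text record of the kind shown below. The sample citation is
   invented, and this sketch handles only the simplest case (one
   brace-delimited value per field); real BibTeX allows nested braces,
   quoted values, and string macros that require a full parser.

```python
import re

# A hypothetical BibTeX entry of the kind collected in these
# databases (the citation itself is invented for illustration).
sample = """@article{doe96,
  author  = {Jane Doe},
  title   = {A Sample Neural Network Paper},
  journal = {IEEE Transactions on Neural Networks},
  year    = {1996},
}"""

def parse_bibtex_entry(text):
    """Extract (entry type, citation key, field dict) from one
    simple BibTeX entry with brace-delimited field values."""
    head = re.match(r'@(\w+)\{([^,]+),', text)
    entry_type, key = head.group(1), head.group(2)
    fields = dict(re.findall(r'(\w+)\s*=\s*\{([^}]*)\}', text))
    return entry_type, key, fields

entry_type, key, fields = parse_bibtex_entry(sample)
```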

8. USENET newsgroup comp.org.issnnet
++++++++++++++++++++++++++++++++++++

   Forum for discussion of academic/student-related issues in NNs, as well
   as information on ISSNNet (see question "associations") and its
   activities. 

9. AI CD-ROM
++++++++++++

   Network Cybernetics Corporation produces the "AI CD-ROM". It is an
   ISO-9660 format CD-ROM and contains a large assortment of software
   related to artificial intelligence, artificial life, virtual reality, and
   other topics. Programs for OS/2, MS-DOS, Macintosh, UNIX, and other
   operating systems are included. Research papers, tutorials, and other
   text files are included in ASCII, RTF, and other universal formats. The
   files have been collected from AI bulletin boards, Internet archive
   sites, university computer departments, and other government and
   civilian AI research organizations. Network Cybernetics Corporation
   intends to release annual revisions to the AI CD-ROM to keep it up to
   date with current developments in the field. The AI CD-ROM includes
   collections of files that address many specific AI/AL topics including
   Neural Networks (Source code and executables for many different platforms
   including Unix, DOS, and Macintosh. ANN development tools, example
   networks, sample data, tutorials. A complete collection of Neural Digest
   is included as well.) The AI CD-ROM may be ordered directly by check,
   money order, bank draft, or credit card from: Network Cybernetics
   Corporation; 4201 Wingren Road Suite 202; Irving, TX 75062-2763; Tel
   214/650-2002; Fax 214/650-1929. The cost is $129 per disc + shipping
   ($5/disc domestic or $10/disc foreign). (See the comp.ai FAQ for further
   details.) 

10. NN events server
++++++++++++++++++++

   There is a WWW page for Announcements of Conferences, Workshops and Other
   Events on Neural Networks at IDIAP in Switzerland. WWW-Server: 
   http://www.idiap.ch/html/idiap-networks.html. 

11. World Wide Web
++++++++++++++++++

   On the World-Wide Web (WWW; for example, via the xmosaic program) you
   can read neural network information by opening one of the following
   uniform resource locators (URLs): http://www.neuronet.ph.kcl.ac.uk
   (NEuroNet, King's College, London), http://www.eeb.ele.tue.nl (Eindhoven,
   Netherlands), http://www.emsl.pnl.gov:2080/docs/cie/neural/ (Richland,
   Washington), http://www.cosy.sbg.ac.at/~rschwaig/rschwaig/projects.html
   (Salzburg, Austria), http://http2.sils.umich.edu/Public/nirg/nirg1.html
   (Michigan), http://www.lpac.ac.uk/SEL-HPC/Articles/NeuralArchive.html
   (London), http://rtm.science.unitn.it/ (Reactive Memory Search (Tabu
   Search) page, Trento, Italy), and http://www.wi.leidenuniv.nl/art/ (ART
   WWW site, Leiden, Netherlands).
   Many others are available too; the WWW is changing all the time. 

12. Neurosciences Internet Resource Guide
+++++++++++++++++++++++++++++++++++++++++

   This document aims to be a guide to existing, free, Internet-accessible
   resources helpful to neuroscientists of all stripes. An ASCII text
   version (86K) is available in the Clearinghouse of Subject-Oriented
   Internet Resource Guides as follows:

   ftp://una.hh.lib.umich.edu/inetdirsstacks/neurosci:cormbonario, 
   gopher://una.hh.lib.umich.edu/00/inetdirsstacks/neurosci:cormbonario, 
   http://http2.sils.umich.edu/Public/nirg/nirg1.html. 

13. Academic programs list
++++++++++++++++++++++++++

   Rutvik Desai <rutvik@c3serve.c3.lanl.gov> has a compilation of academic
   programs offering interdisciplinary studies in computational neuroscience,
   AI, cognitive psychology etc. at 
   http://www.cs.indiana.edu/hyplan/rudesai/cogsci-prog.html 

   Links to neurosci, psychology, linguistics lists are also provided. 

14. INTCON mailing list
+++++++++++++++++++++++

   INTCON (Intelligent Control) is a moderated mailing list set up to
   provide a forum for communication and exchange of ideas among researchers
   in neuro-control, fuzzy logic control, reinforcement learning and other
   related subjects grouped under the topic of intelligent control. Send
   your subscribe requests to intcon-request@phoenix.ee.unsw.edu.au 

------------------------------------------------------------------------

Subject: Databases for experimentation with NNs?
================================================

1. The neural-bench Benchmark collection
++++++++++++++++++++++++++++++++++++++++

   Accessible via anonymous FTP on ftp.cs.cmu.edu [128.2.206.173] in
   directory /afs/cs/project/connect/bench. In case of problems or if you
   want to donate data, email contact is "neural-bench@cs.cmu.edu". The data
   sets in this repository include the 'nettalk' data, 'two spirals',
   protein structure prediction, vowel recognition, sonar signal
   classification, and a few others. 

2. Proben1
++++++++++

   Proben1 is a collection of 12 learning problems consisting of real data.
   The datafiles all share a single simple common format. Along with the
   data comes a technical report describing a set of rules and conventions
   for performing and reporting benchmark tests and their results.
   Accessible via anonymous FTP on ftp.cs.cmu.edu [128.2.206.173] as 
   /afs/cs/project/connect/bench/contrib/prechelt/proben1.tar.gz and also
   on ftp.ira.uka.de [129.13.10.90] as /pub/neuron/proben.tar.gz. The file
   is about 1.8 MB and unpacks into about 20 MB. 

3. UCI machine learning database
++++++++++++++++++++++++++++++++

   Accessible via anonymous FTP on ics.uci.edu [128.195.1.1] in directory 
   /pub/machine-learning-databases. 
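
   Most datasets in this repository arrive as plain comma-separated text
   (a *.data file, usually without a header row and with the class label
   in the last column; a separate *.names file describes the attributes).
   A minimal loading sketch, using an invented two-row sample in that
   layout rather than actual repository data:

```python
import csv
import io

# Two invented rows imitating the common UCI *.data layout:
# numeric feature columns followed by a string class label.
sample = """5.1,3.5,1.4,0.2,Iris-setosa
6.2,2.9,4.3,1.3,Iris-versicolor
"""

def load_uci_data(text):
    """Parse each row into (list of float features, string label)."""
    rows = []
    for record in csv.reader(io.StringIO(text)):
        if not record:  # skip any blank trailing line
            continue
        *features, label = record
        rows.append(([float(x) for x in features], label))
    return rows

data = load_uci_data(sample)
```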

4. NIST special databases of the National Institute Of Standards
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
   And Technology:
   +++++++++++++++

   Several large databases, each delivered on a CD-ROM. Here is a quick
   list. 
    o NIST Binary Images of Printed Digits, Alphas, and Text 
    o NIST Structured Forms Reference Set of Binary Images 
    o NIST Binary Images of Handwritten Segmented Characters 
    o NIST 8-bit Gray Scale Images of Fingerprint Image Groups 
    o NIST Structured Forms Reference Set 2 of Binary Images 
    o NIST Test Data 1: Binary Images of Hand-Printed Segmented Characters 
    o NIST Machine-Print Database of Gray Scale and Binary Images 
    o NIST 8-Bit Gray Scale Images of Mated Fingerprint Card Pairs 
    o NIST Supplemental Fingerprint Card Data (SFCD) for NIST Special
      Database 9 
    o NIST Binary Image Databases of Census Miniforms (MFDB) 
    o NIST Mated Fingerprint Card Pairs 2 (MFCP 2) 
    o NIST Scoring Package Release 1.0 
    o NIST FORM-BASED HANDPRINT RECOGNITION SYSTEM 
   Here are example descriptions of two of these databases: 

   NIST special database 2: Structured Forms Reference Set (SFRS)
   --------------------------------------------------------------

   The NIST database of structured forms contains 5,590 full page images of
   simulated tax forms completed using machine print. THERE IS NO REAL TAX
   DATA IN THIS DATABASE. The structured forms used in this database are 12
   different forms from the 1988 IRS 1040 Package X. These include Forms
   1040, 2106, 2441, 4562, and 6251 together with Schedules A, B, C, D, E, F
   and SE. Eight of these forms contain two pages or form faces making a
   total of 20 form faces represented in the database. Each image is stored
   in bi-level black and white raster format. The images in this database
   appear to be real forms prepared by individuals but the images have been
   automatically derived and synthesized using a computer and contain no
   "real" tax data. The entry field values on the forms have been
   automatically generated by a computer in order to make the data available
   without the danger of distributing privileged tax information. In
   addition to the images the database includes 5,590 answer files, one for
   each image. Each answer file contains an ASCII representation of the data
   found in the entry fields on the corresponding image. Image format
   documentation and example software are also provided. The uncompressed
   database totals approximately 5.9 gigabytes of data. 

   NIST special database 3: Binary Images of Handwritten Segmented
   ---------------------------------------------------------------
   Characters (HWSC)
   -----------------

   Contains 313,389 isolated character images segmented from the 2,100
   full-page images distributed with "NIST Special Database 1": 223,125
   digits, 44,951 upper-case, and 45,313 lower-case character images. Each
   character image has been centered in a separate 128 by 128 pixel region;
   the error rate of the segmentation and assigned classification is less
   than 0.1%. The uncompressed database totals approximately 2.75 gigabytes
   of image data and includes image format documentation and example
   software.

   The system requirements for all databases are a 5.25" CD-ROM drive with
   software to read ISO-9660 format. Contact: Darrin L. Dimmick;
   dld@magi.ncsl.nist.gov; (301)975-4147

   The prices of the databases are between US$250 and US$1895. If you wish
   to order a database, please contact: Standard Reference Data; National
   Institute of Standards and Technology; 221/A323; Gaithersburg, MD 20899;
   Phone: (301)975-2208; FAX: (301)926-0416

   Samples of the data can be found by ftp on sequoyah.ncsl.nist.gov in
   directory /pub/data. A more complete description of the available
   databases can be obtained from the same host as 
   /pub/databases/catalog.txt 

5. CEDAR CD-ROM 1: Database of Handwritten Cities, States,
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
   ZIP Codes, Digits, and Alphabetic Characters
   ++++++++++++++++++++++++++++++++++++++++++++

   The Center of Excellence for Document Analysis and Recognition (CEDAR)
   at the State University of New York at Buffalo announces the
   availability of CEDAR CDROM 1 (USPS Office of Advanced Technology). The
   database contains handwritten words and ZIP Codes in high resolution
   grayscale (300 ppi 8-bit) as well as binary handwritten digits and
   alphabetic characters (300 ppi 1-bit). This database is intended to
   encourage research in off-line handwriting recognition by providing
   access to handwriting samples digitized from envelopes in a working
   post office. 

        Specifications of the database include:
        +    300 ppi 8-bit grayscale handwritten words (cities,
             states, ZIP Codes)
             o    5632 city words
             o    4938 state words
             o    9454 ZIP Codes
        +    300 ppi binary handwritten characters and digits:
             o    27,837 mixed alphas and numerics segmented
                  from address blocks
             o    21,179 digits segmented from ZIP Codes
        +    every image supplied with a manually determined
             truth value
        +    extracted from live mail in a working U.S. Post
             Office
        +    word images in the test set supplied with
             dictionaries of postal words that simulate partial
             recognition of the corresponding ZIP Code.
        +    digit images included in test set that simulate
             automatic ZIP Code segmentation. Results on these
             data can be projected to overall ZIP Code
             recognition performance.
        +    image format documentation and software included

   System requirements are a 5.25" CD-ROM drive with software to read
   ISO-9660 format. For any further information, including how to order the
   database, please contact: Jonathan J. Hull, Associate Director, CEDAR,
   226 Bell Hall, State University of New York at Buffalo, Buffalo, NY
   14260; hull@cs.buffalo.edu (email) 

6. AI-CD-ROM (see question "Other sources of information")
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

7. Time series archive
++++++++++++++++++++++

   Various datasets of time series (to be used for prediction learning
   problems) are available for anonymous ftp from ftp.santafe.edu
   [192.12.12.1] in /pub/Time-Series. Example problems include:
   fluctuations in a far-infrared laser; physiological data of patients
   with sleep apnea; high-frequency currency exchange rate data; the
   intensity of a white dwarf star; and J.S. Bach's final (unfinished)
   fugue from "Die Kunst der Fuge".

   Some of the datasets were used in a prediction contest and are described
   in detail in the book "Time Series Prediction: Forecasting the Future
   and Understanding the Past", edited by Weigend and Gershenfeld,
   Proceedings Volume XV in the Santa Fe Institute Studies in the Sciences
   of Complexity series, Addison-Wesley (1994). 

8. USENIX Faces
+++++++++++++++

   The USENIX faces archive is a public database, accessible by ftp, that
   can be of use to people working in the fields of human face recognition,
   classification and the like. It currently contains 5592 different faces
   (taken at USENIX conferences) and is updated twice each year. The images
   are mostly 96x128 greyscale frontal images and are stored in ASCII
   files in a way that makes them easy to convert to any common graphics
   format (GIF, PCX, PBM, etc.). Source code for viewers, filters, etc.
   is provided.
   Each image file takes approximately 25K. 
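
   Once the pixel values have been parsed out of one of those ASCII
   files (the exact file layout is documented in the archive's README,
   so the parsing step is not shown here), emitting a portable format
   such as ASCII PGM is straightforward. The sketch below assumes the
   pixels are already a flat list of 0-255 greyscale values; the
   function name is hypothetical.

```python
def ascii_to_pgm(values, width, height):
    """Build an ASCII PGM (P2) image from a flat, row-major list of
    greyscale pixel values in 0..255.  NOTE: obtaining `values` from
    an archive file is left out; consult the archive's README for
    the actual on-disk layout."""
    if len(values) != width * height:
        raise ValueError("expected %d pixels" % (width * height))
    header = "P2\n%d %d\n255\n" % (width, height)
    rows = []
    for r in range(height):
        row = values[r * width:(r + 1) * width]
        rows.append(" ".join(str(v) for v in row))
    return header + "\n".join(rows) + "\n"

# A 2x2 toy image; the archive's faces are mostly 96x128.
pgm = ascii_to_pgm([0, 85, 170, 255], width=2, height=2)
```

   PGM output can then be converted to GIF, PCX, and so on with the
   usual pbmplus/netpbm tools.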

   For further information, see 
   ftp://src.doc.ic.ac.uk/pub/packages/faces/README. Do NOT do a
   directory listing in the top directory of the face archive, as it
   contains over 2500 entries! 

   According to the archive administrator, Barbara L. Dijker
   (barb.dijker@labyrinth.com), there are no restrictions on the use of
   the images. However, the image files are stored in separate
   directories, one per Internet site to which the pictured person
   belongs, and each directory contains only a few images (two on
   average). This makes it difficult to retrieve even a small part of
   the database by ftp, as each file must be fetched individually.
   A solution, as Barbara proposed to me, would be to compress the whole
   set of images (in separate files of, say, 100 images each) and
   maintain them as a dedicated archive for research on face processing,
   similar to the archives that already exist for fingerprints and other
   data. The whole compressed database would take some 30 megabytes of
   disk space. I encourage anyone willing to host this database at
   his/her site, available for anonymous ftp, to contact her for details
   (unfortunately I don't have the resources to set up such a site). 

   Please bear in mind that UUNET has graciously provided the ftp server
   for the FaceSaver archive and may discontinue that service if it
   becomes a burden. Accordingly, please do not download more than about
   10 faces at a time from uunet. 

   A last remark: each file represents a different person (with isolated
   exceptions). This makes the database quite unsuitable for training
   neural networks, since proper generalisation requires several
   instances of the same subject. It remains useful, however, as a test
   set for an already trained network. 

   ------------------------------------------------------------------------

   Next part is part 5 (of 7). Previous part is part 3. 

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
