Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!news.mathworks.com!usenet.eel.ufl.edu!spool.mu.edu!howland.erols.net!cs.utexas.edu!math.ohio-state.edu!jussieu.fr!rain.fr!news.sprintlink.net!news-dc-9.sprintlink.net!interpath!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: comp.ai.neural-nets FAQ: weekly reminder
Archive-Name: ai-faq/neural-nets
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn_remind.posting_842839201@hotellng.unx.sas.com>
Supersedes: <nn_remind.posting_842234400@hotellng.unx.sas.com>
Date: Mon, 16 Sep 1996 02:00:02 GMT
Expires: Mon, 30 Sep 1996 02:00:01 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Last-Modified: 1996-08-10
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Url: ftp://ftp.sas.com/pub/neural/FAQ.html
Keywords: FAQ, answers, monthly posting, reference, pointer
Followup-To: comp.ai.neural-nets
Lines: 116

This is a reminder of the seven monthly postings to the Usenet
newsgroup comp.ai.neural-nets. The postings are titled:
   comp.ai.neural-nets FAQ, Part 1 of 7: Introduction
   comp.ai.neural-nets FAQ, Part 2 of 7: Learning
   comp.ai.neural-nets FAQ, Part 3 of 7: Generalization
   comp.ai.neural-nets FAQ, Part 4 of 7: Books, data, etc.
   comp.ai.neural-nets FAQ, Part 5 of 7: Free software
   comp.ai.neural-nets FAQ, Part 6 of 7: Commercial software
   comp.ai.neural-nets FAQ, Part 7 of 7: Hardware

FAQ stands for 'Frequently Asked Questions'.  Its purpose is to provide
basic information for individuals who are new to the field of neural
networks or are just beginning to read this group. It should help avoid
lengthy discussions of questions that commonly arise for beginners of
either kind.

>>>>> SO, PLEASE, SEARCH THE FAQ POSTING FIRST IF YOU HAVE A QUESTION <<<<<

                           AND

>>>>> DON'T POST ANSWERS TO FAQs: POINT ASKERS TO THE FAQ POSTING <<<<<

The latest version of the FAQ is available as a hypertext document,
readable by any WWW (World Wide Web) browser such as Mosaic, under the
URL "ftp://ftp.sas.com/pub/neural/FAQ.html".  This version is updated
more frequently than the archived copies.

The FAQ posting is sent to comp.ai.neural-nets on the 28th of every
month.  It is also sent to the groups comp.answers and news.answers,
where it should be available at any time (ask your news administrator).
This reminder is posted every Sunday.

The FAQ posting is archived in the periodic posting archive on host
rtfm.mit.edu (and on some other hosts as well).  Look in the anonymous
ftp directory "/pub/usenet/news.answers/ai-faq/neural-nets".  The
filenames are "part1", "part2", ... "part7".  If you do not have
anonymous ftp access, you can also retrieve the archive by mail server.
For more information, send an E-mail message to mail-server@rtfm.mit.edu
with "help" and "index" in the body on separate lines.

What's new since last month's FAQ posting:
Part 5: Free software
--- PMNEURO 1.0a

Part 6: Commercial software
--- PREVia

The following questions are answered in the FAQ posting:

============================== Questions ==============================

Part 1: Introduction
--- What is this newsgroup for?  How shall it be used?
--- Where is comp.ai.neural-nets archived?
--- What is a neural network (NN)?
--- What can you do with an NN and what not?
--- Who is concerned with NNs?
--- How are layers counted?
--- How are NNs related to statistical methods?

Part 2: Learning
--- How many learning methods for NNs exist?  Which?
--- What is backprop?
--- What are conjugate gradients, Levenberg-Marquardt, etc.?
--- Why use a bias input?
--- Why use activation functions?
--- What is a softmax activation function?
--- What is the curse of dimensionality?
--- How do MLPs compare with RBFs?
--- What are OLS and subset regression?
--- Should I normalize/standardize/rescale the data?
--- Should I nonlinearly transform the data?
--- What is ART?
--- What is PNN?
--- What is GRNN?
--- What does unsupervised learning learn?
--- What about Genetic Algorithms and Evolutionary Computation?
--- What about Fuzzy Logic?

Part 3: Generalization
--- How is generalization possible?
--- How does noise affect generalization?
--- What is overfitting and how can I avoid it?
--- What is jitter? (Training with noise)
--- What is early stopping?
--- What is weight decay?
--- What is Bayesian estimation?
--- How many hidden layers should I use?
--- How many hidden units should I use?
--- How can generalization error be estimated?
--- What are cross-validation and bootstrapping?

Part 4: Books, data, etc.
--- Good literature about Neural Networks?
--- Any journals and magazines about Neural Networks?
--- The most important conferences concerned with Neural Networks?
--- Neural Network Associations?
--- Other sources of information about NNs?
--- Databases for experimentation with NNs?

Part 5: Free software
--- Freely available software packages for NN simulation?

Part 6: Commercial software
--- Commercial software packages for NN simulation?

Part 7: Hardware, etc.
--- Neural Network hardware?
--- Unanswered FAQs

========================================================================
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
