Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!goldenapple.srv.cs.cmu.edu!das-news2.harvard.edu!oitnews.harvard.edu!news.dfci.harvard.edu!camelot.ccs.neu.edu!news.mathworks.com!cam-news-hub1.bbnplanet.com!cpk-news-hub1.bbnplanet.com!news.bbnplanet.com!paladin.american.edu!zombie.ncsc.mil!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: comp.ai.neural-nets FAQ: weekly reminder
Archive-Name: ai-faq/neural-nets
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn_remind.posting_859777203@hotellng.unx.sas.com>
Supersedes: <nn_remind.posting_859172401@hotellng.unx.sas.com>
Date: Mon, 31 Mar 1997 03:00:04 GMT
Expires: Mon, 14 Apr 1997 03:00:03 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Last-Modified: 1997-03-19
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Url: ftp://ftp.sas.com/pub/neural/FAQ.html
Keywords: FAQ, answers, monthly posting, reference, pointer
Followup-To: comp.ai.neural-nets
Lines: 127

This is a reminder of the seven monthly postings to the Usenet
newsgroup comp.ai.neural-nets. The postings are called:
   comp.ai.neural-nets FAQ, Part 1 of 7: Introduction
   comp.ai.neural-nets FAQ, Part 2 of 7: Learning
   comp.ai.neural-nets FAQ, Part 3 of 7: Generalization
   comp.ai.neural-nets FAQ, Part 4 of 7: Books, data, etc.
   comp.ai.neural-nets FAQ, Part 5 of 7: Free software
   comp.ai.neural-nets FAQ, Part 6 of 7: Commercial software
   comp.ai.neural-nets FAQ, Part 7 of 7: Hardware

FAQ stands for 'Frequently Asked Questions'.  Its purpose is to provide
basic information for individuals who are new to the field of neural
networks or are just beginning to read this group. It should help to
avoid lengthy discussion of questions that commonly arise for beginners
of either kind.

>>>>> SO, PLEASE, SEARCH THE FAQ POSTING FIRST IF YOU HAVE A QUESTION <<<<<

                           AND

>>>>> DON'T POST ANSWERS TO FAQs: POINT ASKERS TO THE FAQ POSTING <<<<<

The latest version of the FAQ is available as a hypertext document,
readable by any WWW (World Wide Web) browser such as Mosaic, under the
URL "ftp://ftp.sas.com/pub/neural/FAQ.html".  This version is updated
more frequently than the archived copies.

The FAQ posting goes out to comp.ai.neural-nets on the 28th of every
month.  It is also sent to the groups comp.answers and news.answers,
where it should be available at any time (ask your news manager).
The FAQ posting, like any other posting, may take a few days
to find its way over Usenet to your site. Such delays are especially
common outside of North America.

The FAQ posting is archived in the periodic posting archive on host
rtfm.mit.edu (and on some other hosts as well).  Look in the anonymous
ftp directory "/pub/usenet/news.answers/ai-faq/neural-nets".  The
filenames are "part1", "part2", ... "part7".  If you do not have
anonymous ftp access, you can access the archive by mail server as well.
Send an E-mail message to mail-server@rtfm.mit.edu with "help" and
"index" in the body on separate lines for more information.
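For sites with Unix-style mail tools, the request above can be sketched
as follows. This is only a sketch: the "mail" client name is an
assumption, and your site's mailer may differ.

```shell
# Compose the request body the rtfm.mit.edu mail server expects:
# "help" and "index" on separate lines.
printf 'help\nindex\n' > request.txt
cat request.txt

# To actually send it, pipe the body to your site's mailer, e.g.
# (uncomment if a Unix "mail" command is available):
# mail mail-server@rtfm.mit.edu < request.txt
```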

This reminder is posted every Sunday.

What's new since last month's FAQ posting:

Part 1: Introduction
--- What are cases and variables?
--- What are the population, sample, training set,
    design set, validation set, and test set?

The following questions are answered in the FAQ posting:

============================== Questions ==============================

Part 1: Introduction
--- What is this newsgroup for?  How shall it be used?
--- Where is comp.ai.neural-nets archived?
--- May I copy this FAQ?
--- What is a neural network (NN)?
--- What can you do with an NN and what not?
--- Who is concerned with NNs?
--- How are layers counted?
--- What are cases and variables?
--- What are the population, sample, training set,
    design set, validation set, and test set?
--- How are NNs related to statistical methods?

Part 2: Learning
--- How many learning methods for NNs exist?  Which?
--- What is backprop?
--- What are conjugate gradients, Levenberg-Marquardt, etc.?
--- Why use a bias input?
--- Why use activation functions?
--- What is a softmax activation function?
--- What is the curse of dimensionality?
--- How do MLPs compare with RBFs?
--- What are OLS and subset regression?
--- Should I normalize/standardize/rescale the data?
--- Should I nonlinearly transform the data?
--- How to measure importance of inputs?
--- What is ART?
--- What is PNN?
--- What is GRNN?
--- What does unsupervised learning learn?
--- What about Genetic Algorithms and Evolutionary Computation?
--- What about Fuzzy Logic?

Part 3: Generalization
--- How is generalization possible?
--- How does noise affect generalization?
--- What is overfitting and how can I avoid it?
--- What is jitter? (Training with noise)
--- What is early stopping?
--- What is weight decay?
--- What is Bayesian learning?
--- How many hidden layers should I use?
--- How many hidden units should I use?
--- How can generalization error be estimated?
--- What are cross-validation and bootstrapping?

Part 4: Books, data, etc.
--- Books and articles about Neural Networks?
--- Any journals and magazines about Neural Networks?
--- The most important conferences concerned with Neural Networks?
--- Neural Network Associations?
--- Other sources of information about NNs?
--- Databases for experimentation with NNs?

Part 5: Free software
--- Freeware and shareware packages for NN simulation?

Part 6: Commercial software
--- Commercial software packages for NN simulation?

Part 7: Hardware, etc.
--- Neural Network hardware?
--- Unanswered FAQs

========================================================================
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

