Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!news.acsu.buffalo.edu!news.uoregon.edu!newsfeed.orst.edu!newshub.tc.umn.edu!mr.net!www.nntp.primenet.com!nntp.primenet.com!feed1.news.erols.com!worldnet.att.net!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!hotellng.unx.sas.com!saswss
From: saswss@unx.sas.com (Warren Sarle)
Subject: changes to "comp.ai.neural-nets FAQ" -- monthly posting
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <nn.changes.posting_854510445@hotellng.unx.sas.com>
Supersedes: <nn.changes.posting_851832034@hotellng.unx.sas.com>
Date: Wed, 29 Jan 1997 04:00:46 GMT
Expires: Wed, 5 Mar 1997 04:00:45 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
Reply-To: saswss@unx.sas.com (Warren Sarle)
Organization: SAS Institute Inc., Cary, NC, USA
Keywords: modifications, new, additions, deletions
Followup-To: comp.ai.neural-nets
Lines: 412

==> nn1.changes.body <==
*** nn1.oldbody	Sat Dec 28 23:00:10 1996
--- nn1.body	Tue Jan 28 23:00:15 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part1
! Last-modified: 1996-12-13
  URL: ftp://ftp.sas.com/pub/neural/FAQ.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part1
! Last-modified: 1997-01-07
  URL: ftp://ftp.sas.com/pub/neural/FAQ.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 173,176 ****
--- 173,179 ----
     question mark) 
  
+    Students: please do not ask comp.ai.neural-net readers to do your
+    homework or take-home exams for you. 
+ 
  3. Answers
  ++++++++++
***************
*** 266,271 ****
  ================================================
  
! Two archives are available for comp.ai.neural-nets: 
  
   o ftp://ftp.cs.cmu.edu/user/ai/pubs/news/comp.ai.neural-nets 
   o http://asknpac.npac.syr.edu 
--- 269,275 ----
  ================================================
  
! The following archives are available for comp.ai.neural-nets: 
  
+  o Deja News 
   o ftp://ftp.cs.cmu.edu/user/ai/pubs/news/comp.ai.neural-nets 
   o http://asknpac.npac.syr.edu 

==> nn2.changes.body <==
*** nn2.oldbody	Sat Dec 28 23:00:16 1996
--- nn2.body	Tue Jan 28 23:00:21 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1996-12-23
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part2
! Last-modified: 1997-01-24
  URL: ftp://ftp.sas.com/pub/neural/FAQ2.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 25,28 ****
--- 25,32 ----
     What are OLS and subset regression?
     Should I normalize/standardize/rescale the data?
+       Should I standardize the input variables?
+       Should I standardize the target variables?
+       Should I standardize the variables for unsupervised learning?
+       Should I standardize the input cases?
     Should I nonlinearly transform the data?
     How to measure importance of inputs?
***************
*** 1540,1543 ****
--- 1544,1593 ----
  use an identity output activation function or other unbounded output
  activation function instead; see Why use activation functions? 
+ 
+ Subquestion: Should I standardize the variables (column vectors)
+ ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ for unsupervised learning?
+ ++++++++++++++++++++++++++
+ 
+ The most commonly used methods of unsupervised learning, including various
+ kinds of vector quantization, Kohonen networks, Hebbian learning, etc.,
+ depend on Euclidean distances or scalar-product similarity measures. The
+ considerations are therefore the same as for standardizing inputs in RBF
+ networks--see Should I standardize the input variables (column vectors)?
+ above. 
+ 
+ If you are using unsupervised competitive learning to try to discover
+ natural clusters in the data, rather than for data compression, simply
+ standardizing the variables may be inadequate. For more sophisticated
+ methods of preprocessing, see: 
+ 
+    Art, D., Gnanadesikan, R., and Kettenring, R. (1982), "Data-based Metrics
+    for Cluster Analysis," Utilitas Mathematica, 21A, 75-99. 
+ 
+    Janssen, P., Marron, J.S., Veraverbeke, N., and Sarle, W.S. (1995), "Scale
+    Measures for Bandwidth Selection," Journal of Nonparametric Statistics, 5,
+    359-380. 
+ 
+ Better yet for finding natural clusters, try mixture models or nonparametric
+ density estimation. For example: 
+ 
+    Girman, C.J. (1994), "Cluster Analysis and Classification Tree
+    Methodology as an Aid to Improve Understanding of Benign Prostatic
+    Hyperplasia," Ph.D. thesis, Chapel Hill, NC: Department of Biostatistics,
+    University of North Carolina. 
+ 
+    McLachlan, G.J. and Basford, K.E. (1988), Mixture Models, New York:
+    Marcel Dekker, Inc. 
+ 
+    SAS Institute Inc. (1993), SAS/STAT Software: The MODECLUS Procedure, SAS
+    Technical Report P-256, Cary, NC: SAS Institute Inc. 
+ 
+    Titterington, D.M., Smith, A.F.M., and Makov, U.E. (1985), Statistical
+    Analysis of Finite Mixture Distributions, New York: John Wiley & Sons,
+    Inc. 
+ 
+    Wong, M.A. and Lane, T. (1983), "A kth Nearest Neighbor Clustering
+    Procedure," Journal of the Royal Statistical Society, Series B, 45,
+    362-368. 
  
  Subquestion: Should I standardize the input cases (row vectors)?

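The reason standardization matters for these Euclidean-distance-based methods
can be shown with a minimal sketch (written here in modern Python/NumPy purely
for illustration; the FAQ prescribes no particular software):

```python
import numpy as np

def standardize_columns(X):
    """Rescale each variable (column) to mean 0 and standard deviation 1."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0.0] = 1.0   # leave constant columns unchanged
    return (X - mu) / sigma

# Two variables on very different scales: without standardization, the
# second variable dominates every Euclidean distance, so any
# distance-based competitive-learning method effectively ignores the
# first variable.
X = np.array([[1.0, 1000.0],
              [2.0, 4000.0],
              [3.0, 2500.0]])
Z = standardize_columns(X)
```

After this rescaling, each variable contributes on a comparable scale to the
distances; as noted above, this alone may still be inadequate for finding
natural clusters.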
==> nn3.changes.body <==

==> nn4.changes.body <==
*** nn4.oldbody	Sat Dec 28 23:00:24 1996
--- nn4.body	Tue Jan 28 23:00:30 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part4
! Last-modified: 1996-12-18
  URL: ftp://ftp.sas.com/pub/neural/FAQ4.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part4
! Last-modified: 1996-12-30
  URL: ftp://ftp.sas.com/pub/neural/FAQ4.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 1213,1217 ****
     (London), http://rtm.science.unitn.it/ Reactive Memory Search (Tabu
     Search) page (Trento, Italy), http://www.wi.leidenuniv.nl/art/ (ART WWW
!    site, Leiden, Netherlands), 
     Many others are available too; WWW is changing all the time. 
  
--- 1213,1218 ----
     (London), http://rtm.science.unitn.it/ Reactive Memory Search (Tabu
     Search) page (Trento, Italy), http://www.wi.leidenuniv.nl/art/ (ART WWW
!    site, Leiden, Netherlands), http://nucleus.hut.fi/nnrc/ (Helsinki
!    University of Technology, Finland). 
     Many others are available too; WWW is changing all the time. 
  

==> nn5.changes.body <==
*** nn5.oldbody	Sat Dec 28 23:00:27 1996
--- nn5.body	Tue Jan 28 23:00:34 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part5
! Last-modified: 1996-11-07
  URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part5
! Last-modified: 1997-01-13
  URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 83,86 ****
--- 83,87 ----
  34. NNDT 
  35. Trajan 2.0 Shareware 
+ 36. Neural Networks at your Fingertips 
  
  See also http://www.emsl.pnl.gov:2080/docs/cie/neural/systems/shareware.html
***************
*** 336,345 ****
     of the Helsinki University of Technology, Laboratory of Computer and
     Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND There
!    are versions for Unix and MS-DOS available from cochlea.hut.fi
!    [130.233.168.48] as /pub/lvq_pak/lvq_pak-2.1.tar.Z (340 kB, Unix sources),
!    /pub/lvq_pak/lvq_p2r1.exe (310 kB, MS-DOS self-extract archive), 
!    /pub/som_pak/som_pak-1.2.tar.Z (251 kB, Unix sources), 
!    /pub/som_pak/som_p1r2.exe (215 kB, MS-DOS self-extract archive). (further
!    programs to be used with SOM_PAK and LVQ_PAK can be found in /pub/utils).
  
  19. Nevada Backpropagation (NevProp)
--- 337,342 ----
     of the Helsinki University of Technology, Laboratory of Computer and
     Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND There
!    are versions for Unix and MS-DOS available from 
!    http://nucleus.hut.fi/nnrc/nnrc-programs.html 
  
  19. Nevada Backpropagation (NevProp)
***************
*** 790,793 ****
--- 787,821 ----
     andrew@trajan-software.demon.co.uk for more
     details. 
+ 
+ 36. Neural Networks at your Fingertips
+ ++++++++++++++++++++++++++++++++++++++
+ 
+    "Neural Networks at your Fingertips" is a
+    package of ready-to-reuse neural network
+    simulation source code which was prepared for
+    educational purposes by Karsten Kutza. The
+    package consists of eight programs, each of
+    which implements a particular network
+    architecture together with an embedded example
+    application from a typical application domain.
+    Supported network architectures are 
+     o Adaline, 
+     o Backpropagation, 
+     o Hopfield Model, 
+     o Bidirectional Associative Memory, 
+     o Boltzmann Machine, 
+     o Counterpropagation, 
+     o Self-Organizing Map, and 
+     o Adaptive Resonance Theory. 
+    The applications demonstrate use of the networks
+    in various domains such as pattern recognition,
+    time-series forecasting, associative memory,
+    optimization, vision, and control; examples
+    include sunspot prediction, the traveling
+    salesman problem, and a pole balancer.
+    The programs are coded in portable,
+    self-contained ANSI C and can be obtained from
+    the web pages at 
+    http://www.geocities.com/CapeCanaveral/1624. 
  
  ------------------------------------------------------------------------

==> nn6.changes.body <==
*** nn6.oldbody	Sat Dec 28 23:00:31 1996
--- nn6.body	Tue Jan 28 23:00:38 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part6
! Last-modified: 1996-12-12
  URL: ftp://ftp.sas.com/pub/neural/FAQ6.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part6
! Last-modified: 1997-01-13
  URL: ftp://ftp.sas.com/pub/neural/FAQ6.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 80,83 ****
--- 80,84 ----
  31. Neural Bench 
  32. Trajan 2.0 Neural Network Simulator 
+ 33. DataEngine 
  
  See also http://www.emsl.pnl.gov:2080/docs/cie/neural/systems/software.html 
***************
*** 1495,1498 ****
--- 1496,1529 ----
     There is also a shareware version of the Software available; please
     download this to check whether Trajan 2.0 fulfils your needs. 
+ 
+ 33. DataEngine
+ ++++++++++++++
+ 
+       Name: DataEngine, DataEngine ADL, DataEngine V.i
+ 
+    Company: MIT GmbH
+    Address: Promenade 9
+             52076 Aachen
+             Germany
+ 
+      Phone: +49 2408 94580
+        Fax: +49 2408 94582
+      EMail: mailto:info@mitgmbh.de
+        URL: http://www.mitgmbh.de
+ 
+    DataEngine is a software tool for data analysis implementing
+    Fuzzy Rule Based Systems, Fuzzy Cluster Methods, Neural Networks,
+    and Neural-Fuzzy Systems in combination with conventional methods
+    of mathematics, statistics, and signal processing.
+ 
+    DataEngine ADL enables you to integrate classifiers or controllers
+    developed with DataEngine into your own software environment.  It
+    is offered as a DLL for MS/Windows or as a C++ library for various
+    platforms and compilers.
+ 
+    DataEngine V.i is an add-on tool for LabVIEW(TM) that enables you
+    to integrate Fuzzy Logic and Neural Networks into LabVIEW through
+    virtual instruments to build systems for data analysis as well as
+    for Fuzzy Control tasks.
  
  ------------------------------------------------------------------------

==> nn7.changes.body <==
*** nn7.oldbody	Sat Dec 28 23:00:33 1996
--- nn7.body	Tue Jan 28 23:00:42 1997
***************
*** 1,4 ****
  Archive-name: ai-faq/neural-nets/part7
! Last-modified: 1996-05-27
  URL: ftp://ftp.sas.com/pub/neural/FAQ7.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
--- 1,4 ----
  Archive-name: ai-faq/neural-nets/part7
! Last-modified: 1997-01-18
  URL: ftp://ftp.sas.com/pub/neural/FAQ7.html
  Maintainer: saswss@unx.sas.com (Warren S. Sarle)
***************
*** 27,30 ****
--- 27,43 ----
  =================================
  
+ Thomas Lindblad notes on 96-12-30: 
+ 
+    The reactive tabu search algorithm has been implemented by the
+    group in Trento, Italy. ISA and VME boards are available, with PCI
+    boards coming soon. We tested the system with the IRIS and SATIMAGE
+    data, and it did better than most other chips. 
+ 
+    The Neuroclassifier is still available from Holland and is also the
+    fastest neural network chip, with a transient time of less than 100 ns. 
+ 
+    JPL is making another chip, ARL in WDC is making another, so there
+    are a few things going on ... 
+ 
  Overview articles: 
  
***************
*** 149,182 ****
        was one of the co-designers of this chip.
  
! 9. IC Tech
! ++++++++++
  
!    NEURO-COMPUTING IC's:
!    *  DANN050L (dendro-dendritic artificial neural network)
!       + 50 neurons fully connected at the input
!       + on-chip digital learning capability
!       + 6 billion connections/sec peak speed
!       + learns 7 x 7 template in < 50 nsec., recalls in < 400 nsec.
!       + low power < 100 milli Watts
!       + 64-pin package
!    *  NCA717D  (neuro correlator array)
!       + analog template matching in < 500 nsec.
!       + analog input / digital output pins for real-time computation
!       + vision applications in stereo and motion computation
!       + 40-pin package
!    NEURO COMPUTING BOARD:
!    *  ICT1050
!       + IBM PC compatible or higher
!       + with on-board DANN050L
!       + digital interface
!       + custom configurations available
!    Contact:
!    IC Tech (Innovative Computing Technologies, Inc.)
!    4138 Luff Court
!    Okemos, MI 48864
!    (517) 349-4544
!    ictech@mcimail.com
  
! And here is an incomplete overview over known Neural Computers with their
  newest known reference.
  
--- 162,196 ----
        was one of the co-designers of this chip.
  
! 9. IC Tech, Inc.
! ++++++++++++++++
  
!     *  NRAM (Neural Retrieve Associative Memory) is available as a
!        stand-alone chip or as a functional unit that can be embedded in
!        another chip, e.g., a digital signal processor or SRAM. The data
!        storage procedure is compatible with conventional memories, i.e.,
!        a single presentation of the data is sufficient. Set-up and hold
!        times are comparable with those of existing devices of similar
!        technology dimensions. Data retrieval is where NRAM excels: when
!        addressed, this content-addressable memory produces the one
!        previously stored pattern that most closely matches the presented
!        data sequence; if no matching pattern is found, no data is
!        returned. These error-correction and smart-retrieval tasks are
!        accomplished without comparators, processors, or other external
!        logic. The number of data bits is adjustable, and the optimized
!        circuitry consumes little power. NRAM has many applications in
!        rapid search of large databases, template matching, and
!        associative recall.
! 
!      *  The NRAM development environment includes a PC card with an
!         on-board NRAM chip and C++ source code to address the device.
! 
!         Contact:
! 
!         IC Tech, Inc.
!         2157 University Park Dr.
!         Okemos, MI 48864
!         (517) 349-4544
!         (517) 349-2559  (FAX)
!         http://www.ic-tech.com
!         ictech@ic-tech.com
  
! And here is an incomplete overview of known Neural Computers with their
  newest known reference.
  
***************
*** 439,442 ****
--- 453,457 ----
   o Luke Koops <koops@gaul.csd.uwo.ca> 
   o Kurt Hornik <Kurt.Hornik@tuwien.ac.at> 
+  o Thomas Lindblad <lindblad@kth.se> 
   o Clark Lindsey <lindsey@particle.kth.se> 
   o William Mackeown <mackeown@compsci.bristol.ac.uk> 
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

