
Genetic Algorithms Digest   Friday, March 20 1992   Volume 6 : Issue 11

 - Send submissions to GA-List@AIC.NRL.NAVY.MIL
 - Send administrative requests to GA-List-Request@AIC.NRL.NAVY.MIL
 - anonymous ftp archive: FTP.AIC.NRL.NAVY.MIL (see v6n5 for details)

Today's Topics:
	- LS-1 style classifier?
	- Request for information on paper
	- Info wanted on Evolution Machine
	- software survey - addendum (Evolution Machine)
	- GAucsd 1.2 now in GA-List archive
	- preprints available

**********************************************************************

CALENDAR OF GA-RELATED ACTIVITIES: (with GA-List issue reference)

 Canadian AI Conference, Vancouver,                           May 11-15, 1992
 COGANN, Combinations of GAs and NNs, @ IJCNN-92 (v5n31)      Jun 6,     1992
 ARTIFICIAL LIFE III, Santa Fe, NM                            Jun 15-19, 1992
 Evolution as a computational process, Monterey (v6n9)        Jun 22-24, 1992
 ML-92, Machine Learning Conference, Aberdeen (v6n8)          Jul  1-3,  1992
 10th National Conference on AI, San Jose,                    Jul 12-17, 1992
 FOGA-92, Foundations of Genetic Algorithms, Colorado (v5n32) Jul 26-29, 1992
 COG SCI 92, Cognitive Science Conference, Indiana (v5n39)    Jul 29-Aug 1,1992
 ECAI 92, 10th European Conference on AI (v5n13)              Aug  3-7,  1992
 Parallel Problem Solving from Nature, Brussels, (v5n29)      Sep 28-30, 1992
 SAB92, From Animals to Animats, Honolulu (v6n6)              Dec  7-11, 1992

 (Send announcements of other activities to GA-List@aic.nrl.navy.mil)

**********************************************************************
----------------------------------------------------------------------

From: Patrick Vye <pvye@u.washington.edu>
Date: Fri, 6 Mar 92 11:44:47 -0800
Subject: LS-1 style classifier?

   Hello all . . . 

   I have two requests.  

   (1)  Does anyone have an LS-1-style classifier that is available
   to experiment with?  (Even something remotely related is fine.)

   (2)  Has anyone recently published anything that
   provides a vision for future classifier research?

   Thank you very much.

   -- Pat Vye

------------------------------

From: "Shang-Hong Lai" <hong@reef.cis.ufl.edu>
Date: Fri, 06 Mar 92 14:42:14 EST
Subject: Request for information on paper


      I am interested in looking at the paper below.

      A.E. Nix and M.D. Vose
      " Modeling Genetic Algorithms with Markov Chains "

      If anyone knows where I can find it or how to get it,
      please let me know.

      Thanks for your help.

   - Shang-hong Lai
     hong@mosquito.cis.ufl.edu

------------------------------

From: Art Corcoran <corcoran@penguin.mcs.utulsa.edu>
Date: Wed, 11 Mar 1992 08:26:45 -0600
Subject: Info wanted on Evolution Machine

   I have obtained the documentation and source for the Evolution
   Machine from the ftp site in Germany.  Unfortunately, the source
   is in a password-protected archive.

   I have sent several email messages to the contacts with no reply.

   Is anyone out there using the Evolution Machine?  Do you know the
   archive password (or how to get the contact to reply)?  Is it worth
   using?

   Thanks,
   Art Corcoran
   University of Tulsa

[Editor's Note: See next message! --Alan]

------------------------------

From: schraudo@cs.UCSD.EDU (Nici Schraudolph)
Date: Wed, 18 Mar 92 16:18:06 PST
Subject: software survey - addendum (Evolution Machine)

   The author/contact address for the "Evolution Machine" GA software has
   changed recently.  The new address is:

	      Hans-Michael Voigt             Joachim Born
   Internet:  voigt@mike.fb10.tu-berlin.de   born@max.fb10.tu-berlin.de

   Address:   Technische Universitaet Berlin
	      Fachgebiet Bionik und Evolutionstechnik
	      Forschungsgruppe Bio- und Neuroinformatik, Sekr. ACK1
	      Ackerstrasse 71-76
	      1000 Berlin 65
	      Germany                        Phone: +49-30-314-72-677

   The file pub/GAucsd/GAsoft.txt available by anonymous ftp from
   cs.ucsd.edu has been updated accordingly.

   Best regards,

   - Nici Schraudolph.

------------------------------

From: schraudo@cs.UCSD.EDU (Nici Schraudolph)
Date: Thu, 12 Mar 92 13:47:26 PST
Subject: GAucsd 1.2 now in GA-List archive

   The contents of the pub/GAucsd directory on cs.ucsd.edu -- material
   related to the GAucsd GA software -- are now also available in the file
   pub/galist/source-code/ga-source/ga-ucsd12.tar for anonymous ftp from the
   GA-List archive server ftp.aic.nrl.navy.mil.

   GAucsd is a GENESIS-based GA package incorporating numerous bug fixes
   and user interface improvements.  Major additions include a wrapper
   that simplifies the writing of evaluation functions, a facility to
   distribute experiments over networks of machines, and Dynamic Parameter
   Encoding, a technique that improves GA performance in continuous search
   spaces by adaptively refining the genomic representation of real-valued
   parameters.

   GAucsd was written in C for Unix systems, but the central GA engine is
   easily ported to other platforms.  The entire package can be ported to
   systems where implementations of the Unix utilities "make", "awk" and
   "sh" are available.

   - Nici Schraudolph.

------------------------------

From: DMONTANA@cooper.bbn.com
Date: Mon, 16 Mar 1992 12:22 EDT
Subject: preprints available

  The following three papers are scheduled to be published next month.
  Requests for preprints can be addressed to dmontana@bbn.com.

  From NIPS-4:

	  "A Weighted Probabilistic Neural Network"

  Abstract: The Probabilistic Neural Network (PNN) algorithm represents
  the likelihood function of a given class as the sum of identical,
  isotropic Gaussians.  In practice, PNN is often an excellent pattern
  classifier, outperforming other classifiers including backpropagation.
  However, it is not robust with respect to affine transformations of
  feature space, and this can lead to poor performance on certain data.
  We have derived an extension of PNN called Weighted PNN (WPNN) which
  compensates for this flaw by allowing anisotropic Gaussians, i.e.
  Gaussians whose covariance is not a multiple of the identity matrix.
  The covariance is optimized using a genetic algorithm, some
  interesting features of which are its redundant, logarithmic encoding
  and large population size.  Experimental results validate our claims.
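
  [As we read the abstract, the PNN class likelihood is a sum of Gaussian
  kernels centred on the training exemplars, and WPNN replaces the single
  isotropic width with per-feature weights (a diagonal covariance).  A
  minimal sketch of that distinction, with the GA-optimized encoding
  omitted:]

```python
import math

# PNN vs. WPNN class-likelihood estimates, sketched from the abstract:
# PNN sums identical isotropic Gaussians; WPNN weights each feature
# dimension, i.e. uses a diagonal (anisotropic) covariance.

def pnn_likelihood(x, exemplars, sigma=1.0):
    """Isotropic PNN: sum of identical Gaussians at each exemplar."""
    total = 0.0
    for e in exemplars:
        d2 = sum((xi - ei) ** 2 for xi, ei in zip(x, e))
        total += math.exp(-d2 / (2 * sigma ** 2))
    return total / len(exemplars)

def wpnn_likelihood(x, exemplars, weights):
    """Weighted PNN: per-feature weights give an anisotropic kernel."""
    total = 0.0
    for e in exemplars:
        d2 = sum(w * (xi - ei) ** 2 for w, xi, ei in zip(weights, x, e))
        total += math.exp(-d2 / 2)
    return total / len(exemplars)
```

  [With all weights equal to 1/sigma^2 the two estimates coincide; the
  paper's GA searches over the weights instead of fixing them.]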


  From SPIE Conference on Learning and Adaptive Systems:

	  "Genetic Search of a Generalized Hough Transform Space"

  Abstract: We use a Generalized Hough transform (GHT) to detect and
  track instances of a class of sonar signals.  This class consists of a
  four-dimensional set of curves and hence requires a four-dimensional
  transform space for the GHT.  Many of the signals we need to detect
  are very weak.  Such signals yield peaks in the transform space which
  are both very narrow and not too far above the random background
  variations.  Finding such peaks is difficult.  Exhaustive search over
  a predetermined discretization of the transform space will yield a
  nearly optimal point for a sufficiently fine discretization.  However,
  even with an intelligently chosen discretization, exhaustive search
  requires searching over (and hence evaluating) many points in the
  transform space.  We have therefore developed a genetic algorithm to
  more efficiently search the transform space.  Designing the genetic
  algorithm to work properly has required experimentation with a number
  of its parameters.  The most important of these are (i) the
  representation, (ii) the population size, and (iii) the number of
  runs.
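
  [A minimal sketch of a genetic search over a four-dimensional parameter
  space in the spirit described above; the transform-space evaluation is
  replaced here by an invented peak function, and the selection and
  mutation operators are generic rather than those of the paper:]

```python
import random

# Minimal real-valued GA over a 4-D space.  The fitness function below
# is a made-up peak standing in for evaluating a point of the GHT
# accumulator; all operator choices are illustrative.

random.seed(0)
DIM, POP, GENS = 4, 40, 60

def fitness(p):
    # Hypothetical peak at (0.3, 0.7, 0.1, 0.9).
    return -sum((x - t) ** 2 for x, t in zip(p, (0.3, 0.7, 0.1, 0.9)))

def mutate(p, rate=0.2, scale=0.05):
    return [min(1.0, max(0.0, x + random.gauss(0, scale)))
            if random.random() < rate else x for x in p]

def crossover(a, b):
    cut = random.randrange(1, DIM)
    return a[:cut] + b[cut:]

pop = [[random.random() for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]               # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
```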


  From SPIE Conference on Learning and Adaptive Systems:

  "Genetic Optimization of the Parameters of a Track-While-Detect Algorithm"

  Abstract: We have developed an algorithm to detect the presence of
  narrowband signals and track the time evolution of their center
  frequencies.  This algorithm has 35 parameters whose optimal values
  depend on (among other things): (i) the expected dynamics of the
  signals, (ii) the background statistics, and (iii) the clutter (i.e.,
  the number of simultaneous signals).  Manually optimizing these
  parameters is a difficult task not only because of the large number of
  parameters but also because of the interdependence of their effects on
  performance.  We have therefore devised an automated method for
  optimizing the parameters.  It has three basic components: (i) a
  "truth" database with a graphical interface for easy manual entry of
  "truth", (ii) a scoring function which is a linear combination of six
  subscores (three evaluating detection performance and three evaluating
  tracking performance), and (iii) a distributed genetic algorithm which
  optimizes the parameter values for a particular truth database.  We
  have used this procedure to optimize the parameter values to a variety
  of signal types and environmental conditions.  The results have been
  improved performance as well as the ability to make the algorithm
  adaptive: as the system detects changes in the environmental
  conditions, it can switch to a different set of parameters.
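
  [The scoring function described above, a linear combination of six
  subscores, can be sketched as follows; the subscore names and weights
  here are invented for illustration:]

```python
# Sketch of a scoring function that linearly combines three detection
# subscores and three tracking subscores into one scalar fitness value
# for the genetic algorithm.  Key names are hypothetical.

DETECTION_KEYS = ("hit_rate", "false_alarms", "latency")
TRACKING_KEYS = ("freq_error", "continuity", "fragmentation")

def score(subscores, weights):
    """Weighted sum of the six subscores."""
    return sum(weights[k] * subscores[k]
               for k in DETECTION_KEYS + TRACKING_KEYS)
```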

  David Montana
  dmontana@bbn.com

------------------------------
End of Genetic Algorithms Digest
******************************
