
Genetic Algorithms Digest   Tuesday, October 6 1992   Volume 6 : Issue 34

 - Send submissions to GA-List@AIC.NRL.NAVY.MIL
 - Send administrative requests to GA-List-Request@AIC.NRL.NAVY.MIL
 - anonymous ftp archive: FTP.AIC.NRL.NAVY.MIL (Info in /pub/galist/FTP)

Today's Topics:
	- What is the mutation rate?
	- GENESIS on the CM-200 (Re: (v6n31) parallel GA code; want info)
	- GAs and Poly Sci
	- Re: (v6n31) Request for references on GAs and Economics
	- ICGA-93 length, a gentle protest
	- Question about "Schemata"
	- Availability of IlliGAL Reports
	- COLT 93 CALL FOR PAPERS
	- ECML93 Workshop Call

****************************************************************************

CALENDAR OF GA-RELATED ACTIVITIES: (with GA-List issue reference)

SAB92, From Animals to Animats, Honolulu (v6n6)                 Dec 07-11, 92
ICNN93, IEEE Intl. Conf. on Neural Networks, Calif (v6n24)   Mar 28-Apr 01, 93
ECML-93, European Conf. on Machine Learning, Vienna (v6n26)	Apr 05-07, 93
Foundations of Evolutionary Computation WS, Vienna (v6n34)      Apr     8, 93
Intl. Conf. on Neural Networks and GAs, Innsbruck (v6n22)       Apr 13-16, 93
ICGA-93, Fifth Intl. Conf. on GAs, Urbana-Champaign (v6n29)     Jul 17-22, 93
COLT93, ACM Conf on Computational Learning Theory, UCSC (v6n34) Jul 26-28, 93

(Send announcements of other activities to GA-List@aic.nrl.navy.mil)

****************************************************************************
----------------------------------------------------------------------

From: ds1@philabs.Philips.Com  (Dave Schaffer)
Date: Thu, 24 Sep 92 15:53:24 EDT
Subject: What is the mutation rate?

   What is the mutation rate?

   It seems there are two models of mutation in common use in the GA
   community for empirical work, and this fact may not be widely
   recognized.  The result is confusion when comparing published results.

   Model 1:
   Mutation rate is the probability of CHANGING the allele at each locus.
   Thus Pm = 1.0 complements every string (for binary representations). I.e.,
   it is not random.  To get a random string, set Pm = 0.5.

   Model 2:
   Mutation rate is the probability NOT COPYING the allele at each locus; the
   resulting allele is chosen at random from the set of possible values.
   Thus, Pm = 1.0 gives random strings (for any cardinality representation).

   Goldberg's simple GA (SGA) uses the former, while Grefenstette's GENESIS
   uses the latter.  The mutation rate you specify to Model 2 is twice the
   rate you specify to Model 1 to get the same behavior (for binary).

   At a minimum, future publications that cite mutation rates should specify
   which one they mean. In our early work, we used code derived from GENESIS,
   so mutation rates in our previous publications are Model 2 rates. The CHC
   divergence rates are Model 1 rates.

   It seems intuitive that mutation should refer to the likelihood of actual
   change, yet it also seems intuitive that 100% mutation should be
   associated with random individuals.  Perhaps the community would like to
   discuss the pros and cons of each definition.
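
   For concreteness, the two models might be sketched as follows (an
   illustration only, not code from SGA or GENESIS):

```python
import random

def mutate_model1(bits, pm):
    # Model 1: pm is the probability of CHANGING each allele.
    # pm = 1.0 complements every string; pm = 0.5 gives a random string.
    return [b ^ 1 if random.random() < pm else b for b in bits]

def mutate_model2(bits, pm, alphabet=(0, 1)):
    # Model 2: pm is the probability of NOT COPYING each allele;
    # the replacement is drawn at random from the whole alphabet,
    # so it may happen to equal the original.  pm = 1.0 gives a
    # random string for any cardinality.
    return [random.choice(alphabet) if random.random() < pm else b
            for b in bits]
```

   For binary strings, a Model 2 rate of 2*Pm produces the same expected
   number of actual changes as a Model 1 rate of Pm, since half of Model
   2's redrawn alleles land back on their original value.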

   Dave Schaffer

------------------------------

From: vanlent@cs.utk.edu
Date: Fri, 25 Sep 92 11:13:38 -0400
Subject: GENESIS on the CM-200 (Re: (v6n31) parallel GA code; want info)

	   In response to the request for parallel genetic algorithm
   implementations in GA-List v6n31.

	   I spent this past summer at the Naval Research Lab working with
   Ken De Jong and John Grefenstette to implement Grefenstette's GENESIS
   on the CM-200 in C*.  The result, which I've been calling
   PARAGENESIS, is an attempt to improve performance as much as possible
   without changing the behavior of the genetic algorithm.  Unlike the
   punctuated equilibria and local selection models, PARAGENESIS doesn't
   modify the genetic algorithm to be more parallelizable, as such
   modifications can drastically alter the behavior of the algorithm.
   Instead, each member is placed on a separate processor, allowing
   initialization, evaluation and mutation to proceed completely in
   parallel.  The costs of global control and communication in selection
   and crossover are present but minimized as much as possible.  In
   general, PARAGENESIS on an 8k CM-200 seems to run 10-100 times faster
   than GENESIS on a Sparc 2 and finds equivalent solutions.  The
   solutions are not identical only because the parallel random number
   generator gives a different stream of numbers.
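
   The decomposition described here can be sketched in ordinary Python
   (a rough illustration of the data-parallel structure only; the
   fitness function and rates are hypothetical, and the real
   implementation is in C* with one population member per CM-200
   processor):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(member):
    # Hypothetical evaluation function, standing in for the
    # user-supplied evaluation in GENESIS.
    return sum(member)

def one_generation(population, pm=0.01):
    # Evaluation is independent per member, so it maps directly onto
    # one-member-per-processor parallelism.
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(fitness, population))
    # Selection needs global information (every score), so it is the
    # step that incurs communication and control costs.
    parents = random.choices(population, weights=scores,
                             k=len(population))
    # Mutation is again independent per member.
    return [[b ^ 1 if random.random() < pm else b for b in m]
            for m in parents]
```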

	   PARAGENESIS includes all the features of serial GENESIS plus
   some additions.  The additions include the ability to collect timing
   statistics, probabilistic selection (as opposed to Baker's stochastic
   universal sampling), uniform crossover and local or neighborhood
   selection.  Anyone familiar with the serial implementation of GENESIS
   and C* should have little problem using PARAGENESIS.

	   I have tried PARAGENESIS on the CM-5 with a beta version of C*,
   and it did not work after a simple recompilation.  However, I believe
   the problem is probably due to some gaps in the beta version of C*
   and can easily be fixed by slight modifications to PARAGENESIS.

	   If you are interested in a copy of PARAGENESIS, send me a note
   at vanlent@cs.utk.edu (I'm back to the lowly life of a grad student).
   If there is a big response I'll talk to Alan Schultz and see if we can
   make PARAGENESIS available through anonymous ftp.

   DISCLAIMER: PARAGENESIS is fairly untested at this point and may
   contain some bugs.  I will try to fix any reported bugs as my schedule
   and my access to the CM allows.

   Michael van Lent
   Computer Science Dept.
   University of Tennessee
   Knoxville TN 37996-1301
   vanlent@cs.utk.edu

------------------------------

From: scottmwn@casbah.acns.nwu.edu (Scott Page)
Date: Fri, 25 Sep 92 13:31:05 CDT
Subject: GAs and Poly Sci

   Recently there has been mention of work in economics and genetic
   algorithms in the GA Digest.  John Miller, Ken Kollman and I have been
   applying GAs to political science as well.  Our first paper "Adaptive
   Parties and Spatial Elections" appears in the December issue of the
   American Political Science Review.  In the paper we use genetic
   algorithms to adapt political platforms. Also, Thad Brown at the
   University of Missouri is doing some interesting work on political life
   on a lattice.

   I think he also has some genetic algorithm models but I am not sure.

   Scott E. Page, scottmwn@casbah.acns.nwu.edu

------------------------------

From: Robert Marks <bobm@mummy.agsm.unsw.OZ.AU>
Date: Tue, 29 Sep 1992 17:15:42 +1000
Subject: Re: (v6n31) Request for references on GAs and Economics

   In response to a request from Bernard Manderick for some references to
   GA/CfSys work in the area of economics, Rick Riolo (v6n33) mentioned
   two papers:

	 "Money as a Medium of Exchange in an Economy with Artificially
	  Intelligent agents."  Marimon, McGrattan and Sargent,
	  Santa Fe Institute working paper 89-004.
	  (request at email@sfi.santafe.edu I think) 

	  "Artificial Adaptive Agents in Economic Theory"
	   Holland and Miller, Amer.Econ.Rev, 81 (1991).

   The first is now published: Journal of Economic Dynamics and Control, vol.
   14, pp. 329-373, 1990.  Two more are:

	   "Breeding Hybrid Strategies: Optimal Behaviour for Oligopolists"
	   Robert E. Marks, J. of Evolutionary Economics, vol 2. pp. 17-38,
	   1992.

	   Eaton, B. C., and M. E. Slade, "Evolutionary Equilibrium in
	   Market Supergames," mimeo, November 1989.
   Bob

   Robert MARKS, Visiting Professor, Graduate School of Business, Stanford
		 University, Stanford, CA 94305, USA
   Phones:     (415) 725-7144 (W),  (415) 854-8115 (H), (415) 725-7979 (Fax)
   Internet:   bobm@agsm.unsw.oz.au
	       r.marks@unsw.edu.au
	       FMARKS@gsb-peso.stanford.edu
   BITNET: bobm%mummy.agsm.unsw.oz.au@uunet.uu.net
   or:	mummy.agsm.unsw.oz.au!bobm@uunet.uu.net

------------------------------

From: Rick.Riolo@um.cc.umich.edu
Date: Mon, 28 Sep 92 06:20:54 EDT
Subject: ICGA-93 length, a gentle protest

   A gentle protest...

   Are the published dates for the ICGA-93 correct, i.e., 17-22 July?
   That is *six* days!  I think that's too long---it will be a real
   strain, probably impossible, for me to stay for six days.  I suspect it
   will be a strain for others as well.  Or is it part of the plan to
   order topics so that people need not stay the entire time?
     - r

------------------------------

From: hjsong@camars.kaist.ac.kr (Song Hyojeong)
Date: Tue, 22 Sep 92 17:09:25 KST
Subject: Question about "Schemata"

   Why is the concept of schemata so important?

   Is its significance simply that the Schema Theorem (Holland) shows
   that short, low-order, above-average-fitness schemata (building
   blocks) receive exponentially increasing numbers of trials as
   generations pass, so that, by the Building Block Hypothesis, a GA can
   find (near-)optimal solutions?

   Then what is "implicit parallelism" in schema theory?  Does it mean
   only that schemata are processed implicitly and in parallel,
   according to the Schema Theorem, whenever the simple genetic
   operators are applied explicitly to a population of strings in each
   generation?

   And could the schema concept be used to direct the search while a
   genetic algorithm is running?

   If anyone has answers to these questions, please let me know.
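
   One common illustration of implicit parallelism, sketched here for
   concreteness (an illustration only, not code from any particular
   source):

```python
from itertools import product

def matches(string, schema):
    # A string instantiates a schema if they agree at every
    # position where the schema is not the don't-care symbol '*'.
    return all(s == '*' or s == c for s, c in zip(schema, string))

def schemata_of(string):
    # Every string of length L is simultaneously an instance of
    # 2**L schemata, so evaluating one string implicitly yields
    # fitness information about all of them at once -- the usual
    # reading of "implicit parallelism".
    return [''.join(c if keep else '*'
                    for c, keep in zip(string, mask))
            for mask in product([False, True], repeat=len(string))]
```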

      from Song Hyojeong 
       Computer Architecture Lab.
	 Computer Science Department 
	   KAIST  Korea

     e-mail address  : hjsong@camars.kaist.ac.kr

------------------------------

From: deb@gal1.ge.uiuc.edu (Kalyanmoy Deb)
Date: Fri, 25 Sep 92 09:48:27 -0500
Subject: Availability of IlliGAL Reports

   The Illinois Genetic Algorithms Laboratory is pleased to announce the
   availability of the following technical reports.  Copies can be
   obtained by contacting

   Monica Heckert
   Illinois Genetic Algorithms Laboratory
   308A Transportation Building
   104 South Mathews Avenue
   Urbana, IL 61801

   e-mail: heckert@gal1.ge.uiuc.edu
   phone:  (217)333-2346 (9AM to 5PM CT, M-F)


   IlliGAL Report No 92001

   Title: Sufficient Conditions for Deceptive and Easy Binary Functions
   Authors: Kalyanmoy Deb and David E. Goldberg

   This paper finds sufficient conditions for fully or partially deceptive
   binary functions by calculating schema average fitness values.
   Deception conditions are first derived for functions of unitation and
   then extended for any binary function. The analysis is also extended to
   find a set of sufficient conditions for fully easy binary functions.
   It is found that the computational complexity required to investigate
   full or partial deception in a problem of size $\ell$ using these
   sufficient conditions is $O(2^{\ell})$ and using all necessary
   conditions of deception is $O(4^{\ell})$. This suggests that these
   sufficient conditions can be used to quickly test deception in a
   function. Furthermore, it is found that these conditions may also be
   systematically used to design a fully deceptive function by performing
   only $O(\ell^2)$ comparisons and to design a partially deceptive
   function to order $k$ by performing only $O(k\ell)$ comparisons.  The
   analysis shows that in the class of functions of unitation satisfying
   these conditions of deception, an order-$k$ partially deceptive
   function is also partially deceptive to any lower order. Finally, these
   sufficient conditions are used to investigate deception in a number of
   currently-used deceptive problems.
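
   As background: a function of unitation depends only on u, the number
   of ones in the string.  A standard deceptive example of this kind (a
   generic trap function with illustrative parameters of our own
   choosing, not taken from the report) is:

```python
def trap(bits, z=None, a=1.0, b=2.0):
    # A function of unitation: fitness depends only on u = number of
    # ones.  For u <= z fitness slopes up toward the all-zeros
    # "deceptive attractor" (value a at u = 0); for u > z it rises to
    # the isolated global optimum (value b > a at u = len(bits)).
    # Low-order schema averages favor the deceptive basin.
    l = len(bits)
    if z is None:
        z = l - 1
    u = sum(bits)
    if u <= z:
        return a * (z - u) / z
    return b * (u - z) / (l - z)
```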


   IlliGAL Report No 92002

   Title: Parallel Recombinative Simulated Annealing: A Genetic Algorithm 
   Authors: Samir W. Mahfoud and David E. Goldberg

   This paper introduces, analyzes, and utilizes a parallel method of
   simulated annealing. Borrowing from genetic algorithms, an effective
   combination of simulated annealing and genetic algorithms is developed.
   This new algorithm, called {\em parallel recombinative simulated
   annealing}, retains the desirable asymptotic convergence properties of
   simulated annealing, while adding the population-based approach and
   recombinative power of genetic algorithms. By varying the population
   size and the number of iterations per temperature, one can obtain an
   optimization procedure which more closely resembles either simulated
   annealing or the genetic algorithm. Parallel recombinative simulated
   annealing is presented, and a global convergence proof is given. The
   algorithm is run repeatedly on two deceptive problems to demonstrate
   the added implicit parallelism which results from larger population
   sizes.
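
   The acceptance mechanism at the heart of such a combination can be
   sketched as a Metropolis-style trial between a parent and its
   recombined offspring (our reading of the general idea, not code from
   the report):

```python
import math
import random

def boltzmann_trial(parent, child, fitness, T):
    # Metropolis acceptance for maximization: a better offspring
    # always replaces its parent; a worse one replaces it with
    # probability exp((f(child) - f(parent)) / T), which shrinks
    # as the temperature T is lowered.
    fp, fc = fitness(parent), fitness(child)
    if fc >= fp or random.random() < math.exp((fc - fp) / T):
        return child
    return parent
```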


   IlliGAL Report No 92003

   Title: Multimodal Deceptive Functions
   Authors: Kalyanmoy Deb, Jeffrey Horn, and David E. Goldberg

   This paper presents a static analysis of deception in multimodal
   functions.  Deception in a bipolar function of unitation (a function
   with two global optima and a number of deceptive attractors) is defined
   and a set of sufficient conditions relating function values is
   obtained.  A bipolar deceptive function is also constructed from
   low-order Walsh coefficients. Multimodal functions of bounded deception
   are formed by concatenating a number of bipolar deceptive functions
   together.  These functions offer a great challenge to global
   optimization algorithms including genetic algorithms because they are
   deceptive and have a large number of attractors, of which only a few
   are global optima. These functions are also useful to study because
   they open doors for generalizing the notion of deception and allow us
   to better understand the importance of deception in the study of
   genetic algorithms.


   IlliGAL Report No 92004

   Title: Crowding and Preselection Revisited
   Authors: Samir W. Mahfoud

   This paper considers the related algorithms, crowding and preselection,
   as potential multimodal function optimizers. It examines the ability of
   the two algorithms to preserve diversity, especially multimodal
   diversity. Crowding is analyzed in terms of the number of replacement
   errors it makes. Different strategies for reducing or eliminating error
   are proposed and examined. Finally, a variation of preselection is
   presented which approximates crowding, virtually eliminates replacement
   error, and restores selection pressure.
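
   For readers unfamiliar with crowding, the basic replacement step (a
   minimal sketch of De Jong-style crowding in general, not the
   report's algorithms) looks like this:

```python
import random

def crowding_replace(population, offspring, cf=2):
    # De Jong-style crowding: sample cf members at random and replace
    # the one most similar to the offspring (smallest Hamming
    # distance).  A "replacement error" occurs when the sample misses
    # the member that is truly closest.
    sample = random.sample(range(len(population)), cf)
    closest = min(sample, key=lambda i: sum(
        a != b for a, b in zip(population[i], offspring)))
    population[closest] = offspring
```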


   IlliGAL Report No 92005

   Title: Massive Multimodality, Deception, and Genetic Algorithms
   Authors: David E. Goldberg, Kalyanmoy Deb, and Jeffrey Horn

   This paper considers the use of genetic algorithms (GAs) for the
   solution of problems that are both average-sense misleading (deceptive)
   and massively multimodal. An archetypical multimodal-deceptive problem,
   here called a {\it bipolar deceptive problem}, is defined and two
   generalized constructions of such problems are reviewed, one using
   reflected trap functions and one using low-order Walsh coefficients;
   sufficient conditions for bipolar deception are also reviewed. The
   Walsh construction is then used to form a 30-bit, order-six
   bipolar-deceptive function by concatenating five six-bit bipolar
   functions. This test function, with over five million local optima and
   32 global optima, poses a difficult challenge to simple and niched GAs
   alike. Nonetheless, simulations show that a simple GA can reliably find
   one of the 32 global optima if appropriate signal-to-noise-ratio
   population sizing is adopted. Simulations also demonstrate that a
   niched GA can reliably and simultaneously find all 32 global solutions
   if the population is roughly sized for the expected niche distribution
   and if the function is appropriately scaled to emphasize global
   solutions at the expense of suboptimal ones. These results immediately
   recommend the application of niched GAs using appropriate population
   sizing and scaling. They also suggest a number of avenues for
   generalizing the notion of deception.


   IlliGAL Report No 92006

   Title: Ordering Genetic Algorithms and Deception
   Authors: Hillol Kargupta, Kalyanmoy Deb, and David E. Goldberg

   This paper considers deception in the context of ordering genetic
   algorithms (GAs). Order-four deceptive ordering problems are designed
   by calculating schema fitness values, and an {\it absolute} and a {\it
   relative} ordering decoding are introduced.  Three different crossover
   operators are used in both absolute and relative ordering problems, and
   for each combination of crossover operator and coding, the schema
   survival probability is calculated. Simulation results show that no
   single crossover operator is adequate to find the globally optimal
   solution in both absolute and relative ordering problems. As expected
   from fundamental GA theory, the success of a genetic algorithm depends
   on how well the crossover operator respects the underlying coding of
   the problem.

------------------------------

From: Lenny Pitt <pitt@pitt.cs.uiuc.edu>
Date: Mon, 21 Sep 92 13:55:51 EDT
Subject: COLT 93 CALL FOR PAPERS



                           CALL FOR PAPERS

                              COLT '93

             Sixth ACM Conference on Computational Learning Theory

          University of California at Santa Cruz July 26-28,  1993

   The Sixth Conference on Computational Learning Theory will be held July
   26-28, 1993, at the University of California, Santa Cruz, California.
   The conference is sponsored jointly by the ACM Special Interest Groups
   in Automata and Computability Theory and Artificial Intelligence.
   Registration is open, within the limits of the space available.  We
   invite papers in all areas that relate directly to the analysis of
   learning algorithms and the theory of machine learning, including
   artificial and biological neural networks, robotics, pattern
   recognition, inductive inference, information theory, decision theory,
   Bayesian/MDL estimation, and cryptography. We look forward to a lively,
   interdisciplinary meeting.  As part of our program, we are pleased to
   present two invited talks, one by John Grefenstette on genetic
   algorithms, and one by Geoffrey Hinton on neural networks.


   ABSTRACT SUBMISSION

   Authors should submit fourteen copies (preferably two-sided copies) 
   of an extended abstract to be received by February 9, 1993, to 

			Lenny Pitt 
			COLT '93, Room 2120 DCL
			Department of Computer Science
			University of Illinois at Urbana-Champaign
			1304 W. Springfield Ave. 
			Urbana, Illinois 61801 

   The abstract should consist of 

   -  A cover page with title, authors' names, (postal and e-mail) addresses, 
      and a 200 word summary.  

   -  A body not longer than 10 pages with roughly 35 lines/page in 12-point
      font.  Papers deviating significantly from this length constraint will
      not be considered.  The body should include a clear definition of the
      theoretical model used, an overview of the results, and some discussion
      of their significance, including comparison to other work. Proofs or
      proof sketches should be included in the technical section.
      Experimental results are welcome, but are expected to be supported by
      theoretical analysis.

   An abstract must be *received* by February 9, 1993 (or postmarked 
   January 30 and sent airmail, or sent overnight delivery on February 8).  
   This deadline is FIRM.  Papers that have appeared in journals or
   other conferences, or that are being submitted to other conferences,
   are not appropriate for submission to COLT.

   PROGRAM FORMAT 

   Depending on submissions, and in order to accommodate a broad variety of
   papers, the final program may consist of both "long" talks, and "short"
   talks, corresponding to longer and shorter papers in the proceedings.
   The short talks might also be coupled with a poster presentation in a
   special poster session.  All papers will be considered for both
   categories.  Authors who do not want their papers considered for the
   short category should indicate that fact in the cover letter.  The
   cover letter should also specify the contact author.


   NOTIFICATION

   Authors will be notified of acceptance or rejection by a letter mailed 
   on or before April 9, with possible earlier notification via email.  
   Final camera-ready papers will be due on May 14.


   CONFERENCE CHAIR     David Helmbold 
			(UC Santa Cruz, email to colt93@cse.ucsc.edu).

   LOCAL ARRANGEMENTS   David Haussler 
			(UC Santa Cruz, email to colt93@cse.ucsc.edu).

   PROGRAM CHAIR        Lenny Pitt 
			(U. Illinois, Urbana, email to colt93@cs.uiuc.edu).


   PROGRAM COMMITTEE

   Dana Angluin (Yale), 
   Wray Buntine (RIACS/Nasa Ames Rsrch. Ctr.),
   Bob Daley (U. Pittsburgh), 
   Sally Goldman (Washington U. St. Louis), 
   Ming Li (U. Waterloo), 
   Yishay Mansour (Tel Aviv U. and IBM TJ Watson Research Center), 
   Lenny Pitt (U. Illinois), 
   Ron Rivest (MIT), 
   Sebastian Seung (AT&T Bell Labs), 
   Takeshi Shinohara (Kyushu Inst. Tech.), 
   Eduardo Sontag (Rutgers), 
   Rolf Wiehagen  (Humboldt U. and U. Kaiserslautern),
   Kenji Yamanishi (NEC Research Institute).

------------------------------

From: spears@AIC.NRL.Navy.Mil
Date: Thu, 1 Oct 92 12:38:38 EDT
Subject: ECML93 Workshop Call


			     CALL FOR PAPERS
	 Workshop on ``Foundations of Evolutionary Computation''
			 To be held after ECML93
		 Thursday April 8, 1993   Vienna, Austria

	Evolutionary computation refers to the simulated  evolution  of
   structures  based on their performance in an environment.  A variety
   of evolutionary computation approaches have emerged in the last  few
   decades, including "evolutionary programming" (Fogel, 1966), "evolu-
   tion strategies" (Rechenberg, 1973), "genetic algorithms"  (Holland,
   1975), and "genetic programming" (de Garis, 1990; Koza, 1990).

	The goal of this workshop is to focus on the more general topic
   of  evolutionary  computation  and  to draw researchers from diverse
   areas to discuss its foundations.  The topic of this workshop  is  a
   unifying theme for researchers working in the different evolutionary
   computation approaches.  It will also  be  of  interest  to  related
   research   communities,  such  as  artificial  life.   The  workshop
   encourages papers on the following topics:

    -  Theories of evolutionary computation.  The theories should  con-
       trast and compare different evolutionary computation approaches,
       such as genetic algorithms, evolution  strategies,  evolutionary
       programming, and genetic programming.

    -  Comparisons of different evolutionary computation approaches  on
       machine  learning  tasks.   The  comparisons  may be theoretical
       and/or experimental.

   Please send 4 hard copies of a  paper  (10-15  double-spaced  pages,
   ECML-93  format)  or  (if  you  do  not  wish  to present a paper) a
   description of your current research to:

      Commanding Officer
      Naval Research Laboratory
      Code 5510, Attn: William M. Spears
      4555 Overlook Avenue, SW
      Washington, DC  20375-5320

   Email submissions to spears@aic.nrl.navy.mil are also acceptable,
   but they must be in PostScript.  FAX submissions will not be
   accepted.  If you have any questions about the workshop, please send
   email to William M. Spears at spears@aic.nrl.navy.mil or call
   202-767-9006.

   Important Dates (all deadlines will be strict):

      January 11 - Papers and research descriptions due
      February 1 - Acceptance notification
      February 22 - Final version of papers due

   Program Committee:

      William M. Spears, Naval Research Laboratory (USA, chair)
      Kenneth A. De Jong, George Mason University (USA, co-chair)
      Gilles Venturini, Universite de Paris-Sud (France, co-chair)
      Diana F. Gordon, Naval Research Laboratory (USA)
      David Fogel, ORINCON Corporation (USA)
      Hugo de Garis, Electro Technical Lab (Japan)
      Thomas Baeck, University of Dortmund (Germany)

------------------------------
End of Genetic Algorithms Digest
******************************
