Genetic Algorithms Digest    Friday, 20 January 1989    Volume 3 : Issue 4

 - Send submissions to GA-List@AIC.NRL.NAVY.MIL
 - Send administrative requests to GA-List-Request@AIC.NRL.NAVY.MIL

Today's Topics:
	- Reminder: Call for Papers for ICGA-89
	- Test Functions, Penalty Functions, and Walsh Functions
	- Re: Constrained Optimization
	- Jordan reference
	- Proposal for function optimization suite
	- GA performance limits

--------------------------------

[ The following message is being repeated for those members who
have joined GA-List since October, and for the rest of us who need
to be reminded about deadlines. - JJG]

Date: Tue, 4 Oct 88 16:11:27 EDT
From: ds1@philabs.Philips.Com  (Dave Schaffer)
Subject: ICGA-89 Call for Papers



                              Call for Papers

         The Third International Conference on Genetic Algorithms
                                 (ICGA-89)


     The Third International Conference on Genetic Algorithms (ICGA-89)
     will be held on June 4-7, 1989 at George Mason University near
     Washington, D.C.  Authors are invited to submit papers on all
     aspects of genetic algorithms, including: foundations of
     genetic algorithms, search, optimization, machine learning using
     genetic algorithms, classifier systems, apportionment of credit
     algorithms, and relationships to other search and learning paradigms.
     Papers discussing specific applications (e.g., operations research,
     engineering, science) are encouraged.


     Important Dates:

             10 Feb 89:      Submissions must be received by program chair
             10 Mar 89:      Notification of acceptance or rejection
             10 Apr 89:      Camera ready revised versions due
             4-7 Jun 89:     Conference Dates


     Authors are requested to send four copies (hard copy only) of a
     full paper by February 10, 1989 to the program chair:


                            Dr. J. David Schaffer
                            Philips Laboratories
                            345 Scarborough Road
                            Briarcliff Manor, NY 10510
                            ds1@philabs.philips.com
                            (914) 945-6168



     Conference Committee:

     Conference Chair:       Kenneth A. De Jong, George Mason University
     Local Arrangements:     Lashon B. Booker, Naval Research Lab
     Program Chair:          J. David Schaffer, Philips Laboratories
     Program Committee:      Lashon B. Booker
                             Lawrence Davis, Bolt, Beranek and Newman, Inc.
                             Kenneth A. De Jong
                             David E. Goldberg, University of Alabama
                             John J. Grefenstette, Naval Research Lab
                             John H. Holland, University of Michigan
                             George G. Robertson, Xerox PARC
                             J. David Schaffer
                             Stephen F. Smith, Carnegie-Mellon University
                             Stewart W. Wilson, Rowland Institute for Science


--------------------------------

From: Dave Goldberg <DGOLDBER@UA1VM.ua.edu>
Date: Tue, 17 Jan 89 10:23:40 CST
Subject: Test Functions, Penalty Functions, and Walsh Functions

Any test functions added to the proposed suite should be analyzed using
Walsh functions, as should some of the discussion on the GA-List regarding
penalty functions and constrained optimization.  I show how to do both
in a two-paper set entitled "Genetic
Algorithms and Walsh Functions."  "Part I, A Gentle Introduction" introduces
Walsh functions in monomial form and explains Bethke's schema average
fitness calculation with a straightforward proof.  "Part II, Deception
and Its Analysis" explains how to rigorously define deception (GA-misleading
functions) in terms of operator-adjusted Walsh coefficients.  The
methods described can be used to determine whether test functions should
be expected to cause trouble for this or that GA.  They also can be used
to rigorously identify the deception introduced by excessive penalization
of infeasible points in a constrained optimization problem.  I plan to
submit both papers to ICGA, but those who can't wait should contact me
by e-mail or write to me care of the University of Alabama, Tuscaloosa, AL
35487. Dave Goldberg (dgoldber@ua1vm.ua.edu)
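
For readers unfamiliar with the machinery, the Walsh coefficients of a
fitness function over L-bit strings can be computed by a brute-force
transform.  The following is a small Python sketch (my own, not taken
from the papers): each coefficient w_j measures how strongly the parity
of the bits selected by mask j contributes to fitness.

```python
def walsh_coefficients(f, L):
    """Walsh coefficients of f over L-bit strings, indexed by mask j:
        w_j = (1/2^L) * sum_x f(x) * psi_j(x),
    where psi_j(x) = (-1)^popcount(j AND x)."""
    n = 1 << L
    return [sum(f(x) * (-1) ** bin(j & x).count("1") for x in range(n)) / n
            for j in range(n)]

# Example: ones-counting fitness on 3 bits.  Its only nonzero
# coefficients are the constant term (1.5) and the three
# first-order terms (-0.5 each); all interaction terms vanish,
# i.e., the function is entirely non-deceptive.
onemax = lambda x: bin(x).count("1")
coeffs = walsh_coefficients(onemax, 3)
```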

--------------------------------

From: thefool@ATHENA.MIT.EDU
Subject: Re: Constrained Optimization
Date: Tue, 17 Jan 89 22:48:40 EST
    
> In summary, any strategy that reduces the amount of
> genetic material available is likely to reduce the effectiveness
> of a Genetic Search Algorithm.
>   Jon Richardson

My experience has been the opposite.  My implementation of genetic
algorithms, which produces production rules that predict the outcome of
horse races, was producing a lot of "mud": evolutionary dead ends that
did not improve the accuracy of the system.  By implementing an
algorithm (based on an idea in Paul Resnick's master's thesis) that
eliminates unproductive rules at the end of each generation, I was able
to increase the system's rate of learning.
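
A minimal sketch of such a generation-end culling step (my own guess at
the general shape; the `keep_fraction` threshold and function names are
hypothetical, not the poster's actual system or Resnick's algorithm):

```python
def cull(population, fitness, keep_fraction=0.5):
    """At the end of a generation, rank members by fitness and drop
    the unproductive tail, keeping only the top keep_fraction
    (a hypothetical threshold).  Always keeps at least one member."""
    ranked = sorted(population, key=fitness, reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]
```

The freed slots would then be refilled by reproduction from the
survivors, so crossover and mutation spend no effort on dead ends.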

I would appreciate it if you would explain the nature of the "penalty
constants" you tested.  Do they presuppose the use of the bucket
brigade payoff algorithm?

-- Michael de la Maza		

--------------------------------

From: merrill@bucasb.BU.EDU (John Merrill)
Date:  Wed, 18 Jan 89 09:18:48 EST
Subject: Jordan reference

I have received a number of requests for the Tech Report that I cited
in my recent posting to GA-List (about neural systems with extra
degrees of freedom).  I got in touch with Mike Jordan, and he said:

> You wanted a more readily available reference.  I have a substantially
> revised version of the tech report coming out in a book:  
> 
> Jordan, M. I. (1989).  Indeterminate motor skill learning problems.  In
> M. Jeannerod, (Ed.), Attention and Performance XIII.  Hillsdale, NJ: Erlbaum.
> 
> But the book isn't out yet and I'm not sure exactly when it'll appear.  The
> most readily available reference is probably "jordan@psyche.mit.edu"; I'll
> be happy to mail the paper and/or tech report to anyone who is interested.

--
John Merrill			|	ARPA:	merrill@bucasb.bu.edu
Center for Adaptive Systems	|	
111 Cummington Street		|	
Boston, Mass. 02215		|	Phone:	(617) 353-5765

--------------------------------

From: Philip Resnik <presnik@BBN.COM>
Date:     Thu, 19 Jan 89 16:07:06 EST
Subject:  Proposal for function optimization suite

I'd like to suggest a problem (or perhaps a class of problems)
for the compilation of optimization problems currently underway.

"Epistasis" is the term used to describe a situation in which
a gene or group of genes influences the expression of another gene.
For example, the presence of one gene might suppress the effect of
another; alternatively, one might have a gene whose effect will not
be evident unless a group of other, prerequisite genes are present
as well.  I think epistasis might be one characteristic to look for 
in GA optimization problems.

One problem having what seems to be a minimal degree of epistasis is
the "Lock and Tumbler" problem.  The name (coined by Dave Davis) is
an apt description:  as in the sequence of numbers in a combination 
lock, no number counts until all the previous ones have been taken 
care of, in order.  

    (1) Linear Lock and Tumbler Problem

	Let X = {x1 x2 ... xL} be a bit string (i.e. xi is 0 or 1).
	Maximize F, where

		     L
		    ---
		    \	1, if for all j<=i, xj=1
	F(X) = 	    /	0, otherwise
		    ---
		    i=1

This statement of the problem has a disadvantage in that bit-dependencies
directly correspond to linear order.  A fairer statement of the problem
is the following:

    (1') Lock and Tumbler Problem (shuffled)

	 Identical to the linear Lock and Tumbler Problem, except that at the
	 outset each time the problem is run, we randomly choose a permutation 
	 of {1,2,...,L} (call it P) and interpret j above as "the bit position 
	 pointed to by the jth element of P."  
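
A direct Python sketch of both variants (the function names are my own):

```python
import random

def lock_and_tumbler(bits):
    """Linear Lock and Tumbler fitness: each position i contributes 1
    only if every position j <= i is set, i.e., F counts the length
    of the leading unbroken run of 1s."""
    score = 0
    for b in bits:
        if b != 1:
            break
        score += 1
    return score

def shuffled_lock_and_tumbler(L, seed=None):
    """Shuffled variant: fix a random permutation P once per run, and
    let bit P[j] play the role bit j played in the linear problem, so
    the dependency order no longer matches string order."""
    perm = list(range(L))
    random.Random(seed).shuffle(perm)
    def fitness(bits):
        return lock_and_tumbler([bits[p] for p in perm])
    return fitness
```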


One can think of many kinds of Lock and Tumbler problems.  For example,
consider the following:

    (2) Plateau Lock and Tumbler Problem

	Let X = {x1 x2 ... xL} be a bit string (i.e. xi is 0 or 1),
	where L is a multiple of some integer N.

	Divide X into groups of N bits, {g1 g2 ... gL/N},
	and for each group g, let
			
	    F'(g) =	1, if all bits in the group are set
			0, otherwise

	and 	
			L/N
			---
			\	
	    F(X) = 	/    F'(gi)
			---
			i=1

In this problem, although there are fewer lock-and-tumbler dependencies,
the N-bit plateaus provide a different kind of difficulty.  In terms 
of standardization, I would suggest always using L=36, and varying N 
from 1 to 4 (I've been referring to these as LT(1), LT(2), etc.; 
note that LT(1) is equivalent to no plateaus.)
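
A sketch of LT(N), implementing the formula exactly as given above:

```python
def plateau_lt(bits, N):
    """Plateau Lock and Tumbler LT(N): split the L-bit string into
    L/N groups of N bits; F is the number of groups whose bits are
    all set (F'(g) = 1 iff every bit of g is 1)."""
    assert len(bits) % N == 0, "L must be a multiple of N"
    groups = [bits[i:i + N] for i in range(0, len(bits), N)]
    return sum(1 for g in groups if all(b == 1 for b in g))
```

With the suggested standardization this would be called with
len(bits) == 36 and N ranging from 1 to 4.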

One can also superimpose lock-and-tumbler behavior on other standard problems.
We get bitstring X' from bitstring X by applying the following rule:

	x'i =   1, if for all j<=i, xj=1 
		0, otherwise

Then we use X' as the input to some other standard evaluation function;
for example, one of the functions from De Jong's suite.  
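
A sketch of this transformation: the first 0 encountered zeroes out
everything from that position onward, so only an unbroken leading run
of 1s survives into X'.

```python
def lock_and_tumbler_mask(bits):
    """Map X -> X': x'_i = 1 iff x_j = 1 for all j <= i.  X' can then
    be fed to any standard evaluation function, e.g. one of the
    De Jong suite, to superimpose lock-and-tumbler behavior on it."""
    out = []
    unlocked = True
    for b in bits:
        unlocked = unlocked and b == 1
        out.append(1 if unlocked else 0)
    return out
```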

Philip Resnik
presnik@bbn.com

--------------------------------

From: Mark Hughes <mcvax!camcon.co.uk!mrh@uunet.UU.NET>
Date: Thu, 19 Jan 89 15:02:42 GMT
Subject: GA performance limits

First, thank you John for (this mailing list and) pointing me at some useful
references.  I've bought Goldberg's book, which is a good introduction, if
not cheap, and read a couple of papers.  My enthusiasm to play with GAs is
growing rapidly, and I've lots of ideas I'd like to try.

If I get round to doing so I'll share my experiences.

The reason for this post is to cast around for yet more information on other
work. On reading your paper

Grefenstette, J. J., "Optimization of Control Parameters for Genetic
Algorithms," IEEE Transactions on Systems, Man, and Cybernetics, January 1986.

It occurred to me that one area I've yet to see discussed is determining
performance limits for GAs, to help in deciding how to keep a problem within
manageable limits. 

Although such limits are problem dependent to some extent, I expect that
something could be said about the trade-off between the size of the problem
space and the total number of "payoff" evaluations likely to be required
for a typical problem, for example.

Such information would help in assessing how useful GAs are likely to
be when applied to a particular form of a problem, and would give guidance
as to how much effort to spend limiting a problem to some reasonable "size"
before trying to implement a solver using GAs.

Does anyone know of work relevant to this aspect of GA implementation?

More generally, is there a directory of areas of prior and current work?

------------------- <mrh@camcon.co.uk>  or  <...!mcvax!ukc!idec!camcon!mrh>
|   Mark Hughes   | Tel: +44 (0) 223 420024   Cambridge Consultants Ltd. 
|(Compware . CCL) | Tlx: 931 211 0193 (KZ G)  The Science Park, Milton Road,
------------------- BT Gold: 72:MAG70076      Cambridge UK. (Own opinions etc.)

--------------------------------

End of Genetic Algorithms Digest
********************************

