Genetic Algorithms Digest    Friday, 11 March 1988    Volume 2 : Issue 6

 - Send submissions to GA-List@NRL-AIC.ARPA
 - Send administrative requests to GA-List-Request@NRL-AIC.ARPA

Today's Topics:
	- Note from Moderator
	- Ackley's SIGH and GAs (2 msgs)
	- GAs and VLSI CAD (2 msgs)
	- GAs and Neural Nets (3 msgs)
	- Responses to GA Activity Poll (2 msgs)
	- GAs and TSP
	
------------------------------

Date: Fri, 11 Mar 88 10:46:56 EST
From: John Grefenstette <GA-List-Request@NRL-AIC.ARPA>
Subject: Note from Moderator

I am happy to point out that today's digest is the 2nd
longest issue of GA-List to date, despite the one-week
interval since the last issue!

I will try to keep future issues shorter by increasing their
frequency as necessary.  Keep the submissions coming!

-- JJG


------------------------------

Date: 11 Mar 1988 09:08-EST 
From: Larry.Eshelman@F.GP.CS.CMU.EDU
Subject: The GA versus SIGH

For those interested in the genetic algorithm two questions arise with
regard to David Ackley's SIGH algorithm:  (1) Is it really genetic? 
(2) How does it stack up against the traditional genetic algorithm?  I shall
address the second question.  Dave provides evidence in his dissertation (and
book) that SIGH can outperform the GA.  Unfortunately, Dave does not use
the traditional (Holland-style) GA.  (For example, his two versions use
"termination with prejudice" instead of "reproduction with emphasis" and
use an extremely high initial mutation rate.)  The following is a comparison
of the traditional GA (Grefenstette's GENESIS2) with SIGH and Dave's other
algorithms.  I have purposely used an "off the shelf" version of the GA with
the recommended parameter settings.  In other words, I have not made any
attempt to optimize the GA for Dave's functions.  Also I have used the same
measure of performance that Dave used:  the number of function evaluations
performed before the global maximum is evaluated (averaged over 50 runs).

String
Length:	                    20                        30
         ----------------------------------------    ---
         1-Max  2-Max   Trap   Porc   Plat    Mix    Mix
--------------------------------------------------------
IHC-NA      19     35   3522    ---  15055  11973 148494
IHC-SA     209    271   8808    ---  23433   5843  84380
SHC       2293   1590    ---   4528    494   1398   4688
ISA        619    840 154228   9224    814   5281   9649
SIGH       330    288    780    357   2979   3981  12477

IGS-U      369    364    ---    414    779   3664  17374 
IGS-O      409    494    ---    608    910   2305   5986

IGA-CM     339    290    ---    354    511    665   2230
IGA-C      337    310    ---    351    739    631   1929
IGA-M      257    219    ---   1163    861   2880   3359
--------------------------------------------------------

The first seven algorithms are Dave's.  The sixth and seventh are his
versions of the "GA".  The performance results for these seven algorithms are
taken from his dissertation.  At the end of this report, I briefly describe
the above functions and the algorithms for those who don't have a copy of
Dave's dissertation or book.  Here I will define the last 3 algorithms which
I call IGA-CM, IGA-C, and IGA-M.

IGA-CM is the traditional genetic algorithm (in particular, Grefenstette's
GENESIS2) applied iteratively:  Whenever there is "spinning" (more than 2
generations without an evaluation [because there are no changes]), the GA
is restarted with a new (randomly generated) population.  (See the section
on the convergence ratios for an indication of how much spinning there is.)
Performance is evaluated by counting the number of trials (evaluations)
performed before the algorithm converges to the optimum value.  The trials
that lead to spinning are, of course, included in the total.  There is a
cutoff of 10,000 trials for any given starting population.

IGA-C (crossover only) is the same GA except the mutation rate is set to 0.

IGA-M (mutation only) is the same GA except the crossover rate is set to 0,
the mutation rate is set to 0.01 (ten times the normal rate), and the maximum
number of generations without an evaluation is set to 1 instead of 2.
Because of the higher rate of mutation, IGA-M is less likely to spin.  To
compensate for this, IGA-M uses a lower cutoff point.  The above results are
based on setting the maximum number of trials to 1000 for 20 bit strings and
2000 for 30 bit strings.  (A higher cutoff point would have resulted in
poorer performance.)  A final difference is that the results are based on
the average of 10 runs rather than 50.  (I tried several mutation rates for
the 2-Max function, and found that the 0.01 rate produced the best results.)
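The restart-on-spinning scheme described above can be sketched as follows. This is a hypothetical re-implementation in Python, not GENESIS2 itself: the function and parameter names are illustrative, and only One-Max is shown. Duplicates are cached so that a converged population stops generating evaluations, which is what triggers a restart.

```python
import random

def one_max(bits):                    # f(x) = 10c, c = number of '1' bits
    return 10 * sum(bits)

def iterated_ga(f, n, optimum, pop_size=50, p_cross=0.6, p_mut=0.001,
                max_gens_wo_eval=2, cutoff=10_000, seed=0):
    """Count function evaluations until the optimum is first evaluated,
    restarting from a fresh random population whenever the GA 'spins'
    (more than max_gens_wo_eval generations without a new evaluation)."""
    rng = random.Random(seed)
    trials = 0

    while True:                       # each pass = one (re)start
        seen = {}                     # cache: duplicates cost no evaluation
        pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
        start, gens_wo_eval = trials, 0
        while gens_wo_eval <= max_gens_wo_eval and trials - start < cutoff:
            before = trials
            fits = []
            for ind in pop:
                key = tuple(ind)
                if key not in seen:
                    seen[key] = f(ind)
                    trials += 1
                fits.append(seen[key])
            gens_wo_eval = gens_wo_eval + 1 if trials == before else 0
            if max(fits) >= optimum:
                return trials
            # elitist, fitness-proportionate selection
            elite = pop[fits.index(max(fits))][:]
            total = sum(fits) or 1
            def pick():
                r = rng.uniform(0, total)
                for ind, fit in zip(pop, fits):
                    r -= fit
                    if r <= 0:
                        return ind
                return pop[-1]
            nxt = [elite]
            while len(nxt) < pop_size:
                a, b = pick()[:], pick()[:]
                if rng.random() < p_cross:        # two-point crossover
                    i, j = sorted(rng.sample(range(n), 2))
                    a[i:j], b[i:j] = b[i:j], a[i:j]
                for child in (a, b):
                    for k in range(n):            # bitwise mutation
                        if rng.random() < p_mut:
                            child[k] ^= 1
                nxt += [a, b]
            pop = nxt[:pop_size]
```

Trials spent spinning (and any wasted restarts) are included in the returned count, matching the performance measure used in the table.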

OBSERVATIONS

First of all, note that the traditional GA (IGA-CM) does better than Dave's
two versions of the "GA".

Secondly, if we ignore the Trap function for the moment, it is clear that
the traditional GA outperforms the other algorithms.  For the hard functions
(Porcupine, Plateaus, and Mix), it either ties with or is significantly
better than its best competitor.

But what about the Trap function?  The GA failed to converge to the optimum
for this function.  In his dissertation, Dave explains why:  "In over 98% of
the states in the [trap function] space, all uphill moves lead away from the
global maximum" [p. 83].  Dave also points out why SIGH does so well on
the Trap function:  "In effect, SIGH contains a heuristic that the other
algorithms lack:  Try the opposites of good points" [p. 84].  Dave mentions
in a footnote [p. 85] that he tried a function similar to Trap but which
didn't have this property (the global max being the opposite of the local
max), and none of the algorithms, including SIGH, succeeded except the two
simple hillclimbing algorithms.

Next, note that the iterative GA does quite well without any mutation
(IGA-C).  In fact, except for the Plateaus function, it does as well as
IGA-CM, and sometimes better.  (In 2 runs out of 50, IGA-C got trapped in a
local maximum for the Plateaus function, and this skewed its results --
otherwise the average would have been less than 400 rather than 739.)

Finally, note that the "GA" without crossover but with a high mutation rate
(IGA-M) does fairly well on these functions.  It is as good as IHC-SA for the
1-Max and 2-Max, and performs similarly to the sophisticated hillclimbers
(ISA and SHC) for the hard functions.

GA CONVERGENCE RATIOS

Number of times converged to optimum / number of attempts (starting
populations).

         Plat-20   Mix-20   Mix-30
         -------  -------  -------
IGA-CM	   50/80    50/84   50/172
IGA-C      50/75    50/84   50/156
IGA-M      10/16    10/61   10/24

For the 1-Max, 2-Max, and Porcupine functions the convergence ratio
was always very close to 1.

GA PARAMETER SETTINGS

  Two point crossover
  Elitist strategy
  Population Size:    50
  Crossover Rate:     0.6
  Mutation Rate:      0.001
  Generation Gap:     1.0
  Scaling Window:     5
  Max Gens w/o Eval:  2

THE ALGORITHMS

IHC-SA	  Iterated Hill Climbing - Steepest Ascent
IHC-NA	  Iterated Hill Climbing - Next Ascent
SHC	  Stochastic Hill Climbing
ISA	  Iterated Simulated Annealing
SIGH	  Stochastic Iterated Genetic Hillclimbing
IGS-U     Iterated "Genetic" Search - Uniform combination [Ackley]
IGS-O     Iterated "Genetic" Search - Ordered combination [Ackley]
IGA-CM	  Iterated Genetic Algorithm with Crossover and Mutation
IGA-C	  Iterated Genetic Algorithm with Crossover only
IGA-M	  Iterated Genetic Algorithm with Mutation only

THE FUNCTIONS

'n' is the length of the string
'c' is the number of '1' bits in the string

One-Max:    f(x) = 10c

Two-Max:    f(x) = |18c - 8n|

Trap:		    (z-c)8n/z      if c <= z
	    f(x) =
	            (c-z)10n/(n-z) otherwise
	    where z = floor((3/4)n)

Porcupine:  f(x) = 10c - 15(1-parity(c,n))
	    where parity(i,j) is 1 if i and j have the same parity and 0
	    if they have different parity.

Plateaus:   Divide the bits into four equal-sized groups.
	    For each group, if all the bits are 1, score 2.5n points,
            otherwise score 0 points.
	    Return the sum of the scores for all 4 groups.

Mix:        Divide the bits into five equal-sized groups.
	    Score each group according to one of the above functions
	    (but only a single plateau for the plateaus function).
	    Return the sum of the scores for all 5 groups.
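For readers without the dissertation, the definitions above can be transcribed directly into code. This is an illustrative Python sketch: Mix is omitted since it just composes the others, and the Plateaus grouping assumes n is divisible by 4 (as it is for n = 20).

```python
def c(bits):                        # number of '1' bits in the string
    return sum(bits)

def one_max(bits):
    return 10 * c(bits)

def two_max(bits):
    return abs(18 * c(bits) - 8 * len(bits))

def trap(bits):
    n, k = len(bits), c(bits)
    z = (3 * n) // 4                # z = floor((3/4)n)
    if k <= z:
        return (z - k) * 8 * n / z
    return (k - z) * 10 * n / (n - z)

def porcupine(bits):
    n, k = len(bits), c(bits)
    parity = 1 if k % 2 == n % 2 else 0
    return 10 * k - 15 * (1 - parity)

def plateaus(bits):
    n = len(bits)
    groups = [bits[i:i + n // 4] for i in range(0, n, n // 4)]
    return sum(2.5 * n if all(g) else 0 for g in groups)
```

Note the deceptiveness of Trap: the all-zeros string scores 8n (160 for n = 20), and nearly every uphill move leads toward it, while the global maximum at all-ones scores 10n.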

------------------------------

Date: Fri, 4 Mar 88 16:10:47 est
From: richards@UTKCS2.CS.UTK.EDU (richardson)
Subject: GA(1) or GA(2)

In issue 5, David Ackley quoted Rik Belew as follows:

ACKLEY> Rik Belew offers the following distinction (GA-List V2N4):
ACKLEY>   GA(1) is "a particular (class of) algorithms developed by John
ACKLEY>   Holland and his students.  This GA(1) has as its most distinctive
ACKLEY>   feature the 'cross-over' operator."
ACKLEY>   GA(2) is "a broader class ... of genetic algorithms (sometimes
ACKLEY>   also called 'simulated evolution') that bear some loose resemblance
ACKLEY>   to population genetics...  Generally, these algorithms make use of
ACKLEY>   only a 'mutation' operator."

He then described why he thought Belew's categorizing of the "Neural
GA" into class GA(2) was unjust.  Noteworthy was the following statement:

ACKLEY> Looking more closely at Rik's distinction, I think it turns out either
ACKLEY> that membership in GA(1) is restricted to a small and somewhat quirky
ACKLEY> "DNA-ish" subset of all possible combination rules, or that a model
ACKLEY> such as mine must be allowed to sit around the "crossover campfire",
ACKLEY> odd though its crossovers may appear to be.

Hopefully, the GA community is not so narrow-minded as to exclude anything
which is not DNA-ish cut-and-swap as a valid recombination.
The proper distinction I think is whether or not the recombination operator
in question supports the building block hypothesis.  "Mutation-like
operators" do not do this.  Any kind of weird recombination which can be
shown to propagate and construct building blocks, I would call a Genetic
Algorithm.  If the operator does nothing with building blocks, I would consider
it apocryphal.  It may be valuable but apocryphal nonetheless and shouldn't be
called a GA.

I have to admit ignorance of Ackley's work (I have heard things but only 
second hand), so I don't have an opinion of it.  I merely wanted to express an 
opinion on what distinguishes a GA, which I'll repeat:

       It's the building block hypothesis which makes a GA, not
       whether or not the primary recombination is cross-over
       in a genetic (DNA) sense, and definitely not whether its
       developer studied under Holland.


		   Jon Richardson.

------------------------------

Date: Tue, 8 Mar 88 15:59:51 CST
From: kurt@sneezy.csg.uiuc.edu (Kurt)
Subject: GAs and VLSI CAD

Does anyone know of any references concerning the application of 
genetic algorithms to VLSI CAD problems (i.e., routing, placement, 
PLA folding, etc.)?  I would appreciate any help that could be 
provided.  Thanks in advance.

Kurt Thearling
Coordinated Science Laboratory
University of Illinois
1101 W. Springfield Ave
Urbana, IL  61801

kurt@bach.csg.uiuc.edu or kurt%bach@uxc.cso.uiuc.edu

[ The following papers are relevant:

"Compaction of symbolic layout using genetic algorithms" by Mike Fourman,
and "Bin packing with adaptive search", by Derek Smith, both in the
Proceedings of 1st ICGATA, 1985.

"Applying adaptive search to epistatic domains", by L. Dave Davis, 
IJCAI, 1985.

-- JJG ]

------------------------------

Date: Thu, 10 Mar 88  11:59:59 CST
From: Derek Smith <DSMITH@vdle5.csc.ti.com>
Subject: GAs and VLSI

John, please add me to GA-List.

APPLICATION AREA: GAs for optimization problems for IC design.

GENERAL APPROACH: Permutation crossover analysis and experiment.

GA TOOL: Home grown, TI Explorer.

RESULTS: TSP tours within 0.25% of optimal, 30 cities, reported at the
         2nd GA conference (Oliver, Smith, Holland).

                                Derek.

------------------------------

Date: Wed, 9 Mar 88 14:47:40 PST
From: kamen@cod.nosc.mil (Roxana B. Kamen)
Subject: GAs and Neural Nets

	I am just now starting to look at the possibility of 
using ideas from genetic algorithms research in neural network designs.
There were a few papers addressing ways to do this at last year's Neural
Networks Conference (ICNN).

       Roxana Kamen (kamen@nosc.mil)

------------------------------

Date: Wed, 9 Mar 88 10:43:13 pst
From: Mike Anderson <anderson@BOEING.COM>
Subject: GAs and Neural Nets

 Please add my name to the GA-List.  I would be interested in receiving
copies of recent issues.  I am not at present working on genetic 
algorithms, although I am interested in the approach and consider it
as a possibility when I am faced with a new problem. 
At present I am trying to initiate a research program in neural nets.
I sometimes fantasize about applications combining neural nets and
genetic search strategies in a system that can extend its arena of
competence by adapting knowledge to new problems.

Michael Anderson
Boeing Computer Services
ATC  M/S 7L-64
P.O. Box 24346
Seattle, WA  98124-0346
anderson@boeing.com

------------------------------

Date: Wed, 9 Mar 88 12:43:18 CST
From: lugowski%resbld.csc.ti.com@RELAY.CS.NET
Subject: GAs and Neural Nets

Please add me to the GA-list.  Although I am not presently 
working with GAs, I may do so soon.

				-- Marek Lugowski
				   lugowski@resbld.csc.ti.com
                                   Texas Instruments
                                   Neural Networks Project
                                   P.O. Box 655936, M/s 154
                                   Dallas, Texas 75265

------------------------------

Date: Sat, 5 Dec 87 14:02:51 PST
From: jmg@CS.UCLA.EDU (James M Goodwin)
Subject: Response to Research Activity Poll

	[Apologies for delayed posting -- JJG]

APPLICATION AREA: 
	Combinatorial optimization, 
	Assignment of functionality in programmable logic nets
	cognitive modeling with machine applications
	Boltzmann machine generalizations

GENERAL APPROACH: 
	searching a Boolean problem space 

GA TOOL: 
	Nothing yet. Still trying to accumulate information.
RESULTS: 
	Nothing yet. Still trying to accumulate information.
PROBLEMS: 
	representation, credit assignment


-- Jim Goodwin
   Distributed Machine Intelligence Laboratory
   Department of Computer Science
   3531 Boelter Hall, UCLA
   Los Angeles, CA 90024 
   jmg@cs.ucla.edu

------------------------------

Date: Wed, 9 Mar 88 16:27:25 est
From: Jim Cohoon <cohoon@cssun1.cs.virginia.edu>
Subject: Response to Research Activity Poll

Add me to the list please.

Application Areas: Combinatorial Optimization in general,
	VLSI applications in particular

General Approach: Directed evolution

Results: Parallel Genetic Algorithm Paradigm, VLSI Module Placement,
	VLSI Partitioning Algorithm, Simulated Annealing Comparison

------------------------------

Date:     Thu, 10 Mar 88 10:22:24 MET
From: "J. A. 'Biep' Durieux" <mcvax!cs.vu.nl!biep@uunet.UU.NET>
Subject:  GAs and TSP

I am not currently doing any real GA research, but I am building a (robot)
reasoning and learning system, and plan to implement GAs as one of the
ways to get new knowledge.  The set-up is to be somewhat EURISKO-like, but
more rigorous.  Besides that, GAs just attract me.

A thought:

Besides finding the best value-array, a GA also finds (by way of the
inversion operator) "locally best" orderings on that array (i.e. best,
given the current contents: good values together, and bad values together).
This seems to me an activity very much like the travelling salesman
problem (the array is thought of as circular).  This suggests better
inversion operators.
The problem of what a cross-over operator should do with different
orderings is exactly the problem of how the TSP should be attacked with
a GA. Is there a "clean" solution to that, by now?

						J. A. Durieux
						biep@cs.vu.nl

[ There have been some interesting attempts at the TSP with GAs.
See papers by Grefenstette et al., and by Goldberg & Lingle in ICGATA 85,
papers by Suh & Van Gucht, by Jog & Van Gucht, and by Oliver, Smith & Holland
in ICGATA 87, and by Grefenstette in _Genetic Algorithms and Simulated Annealing_,
L. Davis (ed.).

-- JJG ]

------------------------------

End of Genetic Algorithms Digest
********************


