Newsgroups: comp.ai.genetic
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!godot.cc.duq.edu!news.duke.edu!agate!darkstar.UCSC.EDU!nic.scruz.net!earth.armory.com!jon
From: jon@armory.com (Jon Shemitz)
Subject: Stop me before I lie again!
Organization: Midnight Beach
Date: Wed, 28 Dec 1994 03:56:53 GMT
Message-ID: <D1I5Mu.B9t@armory.com>
Summary: asking for opinions on my comparison of GA to simulated annealing 
Sender: news@armory.com (Usenet News)
Nntp-Posting-Host: deepthought.armory.com
Lines: 24

Pardon the provocative title, but I'm writing an article on GA's for a 
glossy programming magazine (imho, I know just enough to get away with 
it), and at one point I briefly compare them to Simulated Annealing, 
about which I know much less.  I say this:
 
> Many people have found that simulated annealing works well on a broader 
> range of problems than genetic algorithms.  While GA's can be used on 
> virtually any optimization problem, they seem to work best when the 
> problem has a certain degree of modularity so that crossover is not just
> a special sort of randomizing but rather can create a new solution which
> incorporates the best features of both parents.  SA's reliance on simply
> trying new random candidates seems to make it a bit more general.
> 
> However, just as success with physical annealing depends heavily on the 
> amount of heating and the rate of cooling, so too does success with 
> simulated annealing require careful selection of a cooling schedule.
> GA's conceptual simplicity and freedom from "complicated" mathematics 
> make them a good introduction to stochastic search,
 
Is this fair and accurate?
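
In case it helps to see what I mean concretely, here's a toy sketch of 
the two ideas the excerpt contrasts: one-point crossover recombining two 
parents, versus SA perturbing a single candidate under a geometric 
cooling schedule.  The bitstring problem and all the parameters are just 
illustrative, not from the article:

```python
import math
import random

random.seed(1)

def onemax(bits):
    # Toy objective: count of 1-bits (higher is better).
    return sum(bits)

def crossover(mom, dad):
    # One-point crossover: the child takes a prefix from one parent
    # and a suffix from the other, so "good features" of each can
    # be combined when the problem is modular.
    cut = random.randrange(1, len(mom))
    return mom[:cut] + dad[cut:]

def anneal(bits, temp=2.0, cooling=0.95, steps=500):
    # Simulated annealing: flip one random bit per step; always keep
    # improvements, and keep worse moves with probability exp(delta/T).
    best = bits[:]
    for _ in range(steps):
        cand = bits[:]
        i = random.randrange(len(cand))
        cand[i] ^= 1
        delta = onemax(cand) - onemax(bits)
        if delta >= 0 or random.random() < math.exp(delta / temp):
            bits = cand
        if onemax(bits) > onemax(best):
            best = bits[:]
        temp *= cooling  # the cooling rate is the delicate part
    return best
```

The point being: crossover only helps if the cut-and-splice preserves 
something meaningful, while SA works on almost anything but lives or 
dies by that `cooling` constant.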

-- 

http://www.armory.com/~jon
