Newsgroups: comp.ai.genetic,comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!portc02.blue.aol.com!howland.erols.net!netcom.com!nagle
From: nagle@netcom.com (John Nagle)
Subject: Re: GAs: breakthrough or just another optimization strategy ?
Message-ID: <nagleE1rx8t.HB@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <57sej1$gn0$1@news.belwue.de>
Date: Mon, 2 Dec 1996 06:41:16 GMT
Lines: 15
Sender: nagle@netcom6.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai.genetic:10528 comp.ai.neural-nets:34895

student <tis6evst@rhds01.rz.fht-esslingen.de> writes:
>I tried optimizing some functions by implementing the standard
>evolutionary strategy. That worked fine. I do not claim to have gone
>very far, but I doubt whether the evolutionary approach outperforms
>other, earlier-known methods. What if we started, say, a couple of
>thousand gradient descents on the function surface and made them
>communicate with each other in some way to construct an intelligent
>search over the function? This involves exactly the same amount of
>computational resources as the use of a GA.

      I tend to agree.  I've argued this with some of the GA proponents
at Stanford.  There's a reluctance to benchmark GAs against other
optimization algorithms.  Some good standard benchmarks would be useful here.
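
      For concreteness, here is a rough sketch of the poster's idea in
Python with NumPy.  The test function (Rastrigin), the step size, and
the "communication" rule (periodically restarting the worst walkers
near the current best) are illustrative assumptions on my part, not
anyone's published method:

    import numpy as np

    def rastrigin(x):
        # Standard multimodal test function; global minimum f(0) = 0.
        return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x),
                                         axis=-1)

    def rastrigin_grad(x):
        # Gradient of the Rastrigin function, element-wise.
        return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

    def communicating_descent(n_walkers=2000, dim=5, steps=500,
                              lr=0.002, share_every=25, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5.12, 5.12, (n_walkers, dim))
        for t in range(steps):
            x -= lr * rastrigin_grad(x)        # independent gradient steps
            if (t + 1) % share_every == 0:     # "communication" phase
                f = rastrigin(x)
                best = x[np.argmin(f)]
                worst = np.argsort(f)[-(n_walkers // 10):]
                # Restart the worst 10% of walkers near the current best.
                x[worst] = best + 0.5 * rng.standard_normal((worst.size, dim))
        f = rastrigin(x)
        return x[np.argmin(f)], float(f.min())

    x_best, f_best = communicating_descent()
    print(f_best)   # best value found; the restarts ratchet toward 0

The point is just that the per-iteration budget here (2000 walkers, one
gradient evaluation each) is comparable to a GA population of the same
size, so the two approaches could be benchmarked head to head.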

					John Nagle
