Newsgroups: comp.ai.genetic
Path: cantaloupe.srv.cs.cmu.edu!europa.chnt.gtegsc.com!gatech!news.mathworks.com!news.kei.com!simtel!harbinger.cc.monash.edu.au!yarrina.connect.com.au!labtam!labtam!chris
From: chris@labtam.labtam.oz.au (Chris Taylor)
Subject: Re: A cute landscape for playing with GA's
Message-ID: <chris.804740755@labtam>
Organization: Labtam Australia Pty. Ltd., Melbourne, Australia
References: <1995Jun22.061433.25317@labtam.labtam.oz.au> <3sbn71$jss@tabloid.amoco.com> <chris.803866333@labtam> <3t1425$oaf@ixnews3.ix.netcom.com>
Date: Mon, 3 Jul 1995 03:05:55 GMT
Lines: 22

JEThomas@ix.netcom.com (Jonah Thomas) writes:

>...much deleted

>I've put about 8 human-hours into this, and may have missed something
>important.  Of course my idea of clustering mutations was arbitrary,
>and for some fitness functions would be worse than useless.  This is a
>low level of genetic learning.  How do you get GAs to learn how to
>learn more effectively?  How can they observe patterns in fitness
>functions and exploit them?

In a basic sense they already do.
The patterns are observed by selection, and crossover exploits them
(or attempts to).

Naturally the observation and exploitation are somewhat stochastic,
and so won't home in on a feature as reliably as a specialized 'hunter'.

But the GA operators do tend (over time) to try aspects of various other
strategies, and so provide a simple alternative to developing a series of
specialized hunters.
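To make the point concrete, here is a minimal sketch (in Python, with a toy
OneMax fitness function standing in for any landscape — all names here are
illustrative, not from any particular GA package) of how selection "observes"
fitness patterns and crossover exploits them:

```python
import random

def fitness(bits):
    # Toy landscape (OneMax): fitness is the count of 1-bits.
    return sum(bits)

def select(pop, k=2):
    # Tournament selection: fitter individuals win more often, which is
    # how the population implicitly "observes" patterns in the landscape.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # One-point crossover recombines pieces of two good parents,
    # exploiting (stochastically) whatever selection has concentrated.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    # Low-rate mutation keeps exploring around the current region.
    return [b ^ 1 if random.random() < rate else b for b in bits]

def evolve(pop_size=50, length=32, generations=100):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(select(pop), select(pop)))
               for _ in range(pop_size)]
    return max(pop, key=fitness)

best = evolve()
```

No part of this loop examines the fitness function directly; the bias toward
its regularities emerges only from selection pressure plus recombination,
which is the sense in which the observation and exploitation are stochastic.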

