Newsgroups: comp.ai.neural-nets,comp.ai.genetic
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!news.kei.com!newsfeed.internetmci.com!in2.uu.net!DIALix!sydney.DIALix.oz.au!quasar!telford
From: telford@threetek.dialix.oz.au (Telford Tendys)
Subject: Re: calculating the fitness of an ANN
In-Reply-To: "LENGERS R.J.C."'s message of 23 Jan 1996 19:52:11 GMT
Message-ID: <1996Feb1.021900.15464@threetek.dialix.oz.au>
Organization: 3Tek Systems Pty Ltd., N.S.W., Australia (does not endorse this posting)
References: <4d69ng$f0m@mailnews.kub.nl> <x7KFPI8.predictor@delphi.com> <4do7i9$2p9@kaserv.gni.net> <4dofs4$2jn@nfw.bear.com> <4e3e9b$7i4@mailnews.kub.nl>
Date: Thu, 1 Feb 1996 02:19:00 GMT
Lines: 50
Xref: glinda.oz.cs.cmu.edu comp.ai.neural-nets:29616 comp.ai.genetic:7869

[ long leadup and explanation of combining GA with NN ]
[ has been removed from here ]
> 
> Any suggestions or comments anyone, I greatly appreciate your input.
> 
> Roeland J.C. Lengers
> R.J.C.Lengers@kub.nl
> 
From the ``Artificial Life II'' book, the concept of
co-evolution of host and parasite is probably what you
want. I can get you the details of the article if you ask
but I don't have it with me at the moment. The gist of
it (as applied to your problem) is:

* Take all 100 samples and put them into a sample pool.

* Make the parasitic DNA define which 20 samples from the
 pool are used as test samples (the remaining 80 being
 training samples). An example parasite DNA might be
 { 2, 4, 5, 15, 19, 20, 23, 29, 40, 51, 56, 58, 61, 65, 76, 78, 86, 87, 88, 93 }
 Every sample whose index appears in the DNA set is reserved
 as a ``test'' sample, leaving 80 ``training'' samples.
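As a rough sketch of that step (my own illustration, not from the
article): the parasite genome is just a set of indices, and splitting
the pool is a matter of membership testing. I use 0-based indices here
where the example above reads as 1-based.

```python
def split_samples(samples, parasite_dna):
    """Split the sample pool into training and test sets according
    to the parasite's chosen indices (test = chosen, train = rest)."""
    test = [s for i, s in enumerate(samples) if i in parasite_dna]
    train = [s for i, s in enumerate(samples) if i not in parasite_dna]
    return train, test

samples = list(range(100))  # stand-in for the 100 real samples
parasite_dna = {2, 4, 5, 15, 19, 20, 23, 29, 40, 51,
                56, 58, 61, 65, 76, 78, 86, 87, 88, 93}
train, test = split_samples(samples, parasite_dna)
# 80 training samples, 20 test samples
```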

* Make the NN DNA be the training algorithm used including
  ``learning'' constants, repetition times and so on (this must
  be deterministic so NN DNA must include seeds for any random
  numbers that are part of the learning process).
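One possible encoding of such an NN genome (field names are purely
illustrative, not from the article) would bundle the learning
constants together with the random seed so that training is fully
reproducible from the genome alone:

```python
# Hypothetical NN genome: everything the training run depends on,
# including the seed, so two runs of the same genome give the same net.
nn_dna = {
    "learning_rate": 0.05,
    "momentum": 0.9,
    "epochs": 200,
    "hidden_units": 12,
    "seed": 12345,  # fixes the random initial weights (determinism)
}
```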

* Pair up a NN with a parasite by training the NN with the
  80 training samples that the parasite has chosen and
  then testing the trained NN on the 20 test samples.
  The NN scores high if it does well on the test samples,
  the parasite scores high if the NN performs badly on the
  test samples.

* Evolution-wise, NNs compete with other NNs for the highest
  score at the same time as parasites compete with other
  parasites for the highest score.
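The pairing and scoring steps above can be sketched as one generation
of the co-evolution loop. This is only an outline under my own
assumptions: `train_and_test` stands in for the real "train on 80,
test on 20" procedure and is taken to return the fraction of test
samples the trained net gets right, and I make the scores zero-sum so
the parasite gains exactly what the NN loses.

```python
import random

def evaluate_pair(nn_dna, parasite_dna, samples, train_and_test):
    """Pair one NN genome with one parasite genome; return
    (nn_score, parasite_score)."""
    train = [s for i, s in enumerate(samples) if i not in parasite_dna]
    test = [s for i, s in enumerate(samples) if i in parasite_dna]
    accuracy = train_and_test(nn_dna, train, test)  # fraction correct
    return accuracy, 1.0 - accuracy  # parasite scores when the NN fails

def coevolve_generation(nn_pop, parasite_pop, samples, train_and_test):
    """One generation: pair each NN with a random parasite and score
    both sides (selection/mutation on each population would follow)."""
    nn_scores, parasite_scores = [], []
    partners = random.sample(parasite_pop, len(nn_pop))
    for nn, parasite in zip(nn_pop, partners):
        nn_s, p_s = evaluate_pair(nn, parasite, samples, train_and_test)
        nn_scores.append(nn_s)
        parasite_scores.append(p_s)
    return nn_scores, parasite_scores
```

The missing pieces (how to select, cross over, and mutate each
population) are exactly the details the article fills in.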

You need to fill in some details (obviously) but the general
idea is to evolve both the filter algorithm that you are trying
to build and the best way to test that filter algorithm to
see if it is the one that you want. In my example, you will
evolve not only the best neural network weightings for the
given test data but also a training algorithm that is known
to generate networks with a good capability for generalisation
(even when the test data is chosen to be as difficult as possible).

	- Tel
