Newsgroups: comp.ai.genetic
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news4.ner.bbnplanet.net!news3.near.net!paperboy.wellfleet.com!news-feed-1.peachnet.edu!usenet.eel.ufl.edu!gatech!howland.reston.ans.net!vixen.cso.uiuc.edu!usenet.ucs.indiana.edu!news.cs.indiana.edu!mmeiss@avocado.ucs.indiana.edu
From: "mark r meiss" <mmeiss@avocado.ucs.indiana.edu>
Subject: Genetic training of neural networks
Message-ID: <1995Jul12.142004.19296@news.cs.indiana.edu>
Organization: Computer Science, Indiana University
Date: Wed, 12 Jul 1995 14:19:53 -0500
Lines: 46

I thought I ought to share information about some work I've recently done 
in training neural networks with a GA.

I'll describe the code here; depending on the amount of interest, I will 
either e-mail the source or post it.

The purpose of the program was to train a three-layer feed-forward neural 
network (using a log sigmoid transfer function) with a genetic method.  
The network in question can be trained well with backpropagation; 
however, the purpose was to determine whether a certain trait of the 
network was a consequence of backpropagation.  [It wasn't.]
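For reference, the log sigmoid and one feed-forward layer look something 
like the sketch below.  The function and variable names are mine, not 
taken from the actual program:

```cpp
#include <cmath>
#include <vector>

// Log-sigmoid transfer function: squashes any input into (0, 1).
double logsig(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One layer of the net: out[j] = logsig(bias[j] + sum_i w[j][i] * in[i]).
// A three-layer feed-forward net applies this twice (hidden, then output).
std::vector<double> layer(const std::vector<std::vector<double> >& w,
                          const std::vector<double>& bias,
                          const std::vector<double>& in) {
    std::vector<double> out(bias.size());
    for (size_t j = 0; j < bias.size(); ++j) {
        double sum = bias[j];
        for (size_t i = 0; i < in.size(); ++i)
            sum += w[j][i] * in[i];
        out[j] = logsig(sum);
    }
    return out;
}
```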

The program maintains a population of 20 networks.  On each iteration, 
all of the networks are mutated by a function that takes the size of 
mutation and probability of mutating any one weight as inputs.  The 
fitness of the networks is then determined by calculating the sum-squared 
error for a set of training data.  The top 10 networks go on to the next 
generation unchanged; the bottom 10 have their weights and biases 
scrambled with those of the top 10.
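In outline, one generation of this scheme looks something like the 
following.  This is my own sketch, not the program's actual code; the 
struct layout, the uniform mutation, and the 50/50 gene scrambling are 
assumptions about details the description above leaves open:

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

struct Net {
    std::vector<double> genome;  // all weights and biases, flattened
    double error;                // sum-squared error on the training set
};

// Uniform random number in [0, 1).
double frand() { return std::rand() / (RAND_MAX + 1.0); }

// Mutate every network: each gene is perturbed with probability p
// by a uniform amount in [-size, +size].
void mutate(std::vector<Net>& pop, double size, double p) {
    for (size_t n = 0; n < pop.size(); ++n)
        for (size_t g = 0; g < pop[n].genome.size(); ++g)
            if (frand() < p)
                pop[n].genome[g] += size * (2.0 * frand() - 1.0);
}

// One generation: sort by fitness (lowest error first); the top half
// survives unchanged, and each bottom-half net has its genes scrambled
// with those of a randomly chosen top-half net (uniform crossover).
void generation(std::vector<Net>& pop) {
    std::sort(pop.begin(), pop.end(),
              [](const Net& a, const Net& b) { return a.error < b.error; });
    size_t half = pop.size() / 2;
    for (size_t i = half; i < pop.size(); ++i) {
        const Net& parent = pop[std::rand() % half];
        for (size_t g = 0; g < pop[i].genome.size(); ++g)
            if (frand() < 0.5)
                pop[i].genome[g] = parent.genome[g];
    }
}
```

Note that the elitism here is what lets real-valued genes survive: a good 
set of weights is never destroyed, only copied and perturbed.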

Despite the conventional wisdom against using real values as if they 
were simple genes, this approach worked.  After about 4000-5000 
iterations, the network was trained as well as it would be after about 
500 iterations of the MATLAB neural network toolbox's trainbpx function 
(backpropagation with variable learning rate and momentum).

The program can also write MATLAB scripts for simulation of the networks.

For the curious: the program is written in plain vanilla C++ (the only 
possible complication is a call to qsort(), in case your library doesn't 
have it) and totals ~15-20k.

Feedback of all sorts welcome.

Mark Meiss (mmeiss@indiana.edu)
Indiana University CS Dept.

