Newsgroups: comp.ai.neural-nets,comp.ai.genetic,sci.stat.math
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!oitnews.harvard.edu!purdue!haven.umd.edu!news.umbc.edu!eff!news.duke.edu!news.mathworks.com!newsfeed.internetmci.com!in1.uu.net!munnari.OZ.AU!news.uwa.edu.au!DIALix!brisbane.DIALix.oz.au!sydney.DIALix.oz.au!quasar!telford
From: telford@threetek.dialix.oz.au (Telford Tendys)
Subject: Re: Experiments with GA and neural nets
In-Reply-To: Archmage@mageton.demon.co.uk's message of Fri, 29 Dec 1995 17:16:48 GMT
Message-ID: <1996Jan2.015516.9336@threetek.dialix.oz.au>
Organization: 3Tek Systems Pty Ltd., N.S.W., Australia
Date: Tue, 2 Jan 1996 01:55:16 GMT
Lines: 58
Xref: glinda.oz.cs.cmu.edu comp.ai.neural-nets:28964 comp.ai.genetic:7590 sci.stat.math:8591

> From: Archmage@mageton.demon.co.uk (Archmage)
> 
> In message <4bqs6u$rf@cloner2.ix.netcom.com> Jive Dadson  wrote:
> 
> > 
> > I thought you might be interested in a little informal experimenting
> > I have been doing with a genetic algorithm ("GA") for evolving neural
> > networks.
> > 
> 
> Would it be more useful to design a code to produce neural nets? The
> code would have to have no syntax, or else recombining it could produce
> "recipes" which would crash the program.
> 
> What I see would be a load of different bit-strings, which are used to
> generate neural nets, by reading them as "instructions" ..*..
> 
> The neural nets are trained and marked by a different section of the
> program, which then breeds, recombines and mutates the bit-strings as
> if they were the DNA of a genetic algorithm. Etc, etc.
> 

I tried encoding a neural net as a relational database. Each record
stands on its own and the order of records is not significant, so the
encoding is almost syntax free. The fields in each record were
(essentially):

( Source Signal, Destination Signal, Connection Strength )

The database was thus a database of synaptic links, analogous to the
rule database in an expert system. It was also somewhat similar to a
sparse matrix structure (remembering that a matrix is another way to
encode a neural net). If the connection strength of a link was zero
(or close to it) then the link could be dropped from the database with
no (or minimal) effect on the network.
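A minimal sketch of that encoding (the names and the pruning threshold
are my own illustration, not details from the original experiment):

```python
# A neural net stored as order-independent records of
# (source signal, destination signal, connection strength),
# i.e. one row per synaptic link -- equivalent to a sparse weight matrix.
links = [
    ("in0", "h0",  0.8),
    ("in1", "h0", -0.3),
    ("h0",  "out", 1.2),
    ("in1", "out", 1e-6),   # near-zero strength: safe to drop
]

def prune(links, eps=1e-4):
    """Drop links whose strength is negligible; because each record
    stands on its own, the rest of the network is unaffected."""
    return [(src, dst, w) for (src, dst, w) in links if abs(w) >= eps]

print(len(prune(links)))   # the near-zero link is dropped, leaving 3
```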

Various learning algorithms gave reasonable results at adjusting the
connection strengths, but I could find no algorithm that could
actually ADD connections to the database without total disaster
setting in quite soon. Naturally I thought a genetic algorithm would
be the go, but eventually I ended up using my own judgement to
manually create the links (which worked quite well) and letting the
learning algorithm set the strength of each.
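To make the "hand-built links, learned strengths" split concrete, here
is a hedged sketch: a fixed link set whose strengths are adjusted by a
simple delta-rule-style update. This is my own illustration; the post
does not say which learning rule was actually used.

```python
# Links are fixed by hand; only the strength field is learned.
links = {("x", "y"): 0.1}   # (source signal, destination signal) -> strength

def forward(inputs, links):
    """Sum weighted source signals into each destination signal."""
    out = {}
    for (src, dst), w in links.items():
        out[dst] = out.get(dst, 0.0) + w * inputs.get(src, 0.0)
    return out

def train_step(inputs, target, links, lr=0.5):
    """Delta-rule update: nudge each strength toward reducing the
    error at its destination signal. No links are added or removed."""
    out = forward(inputs, links)
    for (src, dst), w in list(links.items()):
        err = target.get(dst, 0.0) - out.get(dst, 0.0)
        links[(src, dst)] = w + lr * err * inputs.get(src, 0.0)

for _ in range(50):
    train_step({"x": 1.0}, {"y": 1.0}, links)
# the strength of the one hand-built link converges toward 1.0
```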

> 
> * Rather than use the bit-strings as a direct description of the
> neural net to be built, it _might_ be more useful to use them to
> create an "embryo" which you drop into a cellular automata-type
> simulation, in order to allow some sections of the bit-string to
> be more significant than others without the need for a syntax.
> 
Hmmm, easier said than done. Lots of cellular automata have been
studied (although these are few compared with the number that have
NOT been studied) and I have yet to see reports of neural behaviour
in these systems.

	- Tel
