Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!fas-news.harvard.edu!newspump.wustl.edu!news.starnet.net!wupost!howland.reston.ans.net!news.sprintlink.net!crash!mkppp.cts.com!user
From: Dean_Abbott@partech.com (dean abbott)
Subject: Re: Looking for information on evolving neural network structure
Organization: pgsc
Date: Wed, 25 Jan 1995 20:16:01 GMT
Message-ID: <Dean_Abbott-2501951222360001@mkppp.cts.com>
References: <3feeuk$j5h@news.iastate.edu> <3g3310$ch4@news.bu.edu>
Sender: news@crash.cts.com (news subsystem)
Nntp-Posting-Host: mkppp.cts.com
Lines: 35

In article <3g3310$ch4@news.bu.edu>, laliden@retina.bu.edu (Lars Liden) wrote:

> (Timothy T Maifeld) writes:
> >I am looking for journal articles or general information on evolving
> >the neural network structure with genetic algorithms or genetic
> >programming.  Any information would be greatly appreciated.
> >
> Rick Belew has a couple of good overview papers on the pros and cons of
> using GAs and back-prop.


If you are interested in network types other than backprop MLPs, there is
a good paper describing statistical learning networks, including inductive
learning:

A.R. Barron and R.L. Barron, "Statistical Learning Networks:  A Unifying
View", 20th Symposium on the Interface, 1989.

It contains many other references as well.  The authors primarily describe
the relationships between polynomial neural networks and statistical
models, including topics such as model selection and complexity
regularization, both of which get to the heart of evolving network
structure (which structure is better?).  There are more recent references
too, but I don't remember them off the top of my head.  Hope this helps.
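To make the model-selection point concrete, here is a toy sketch (my own
illustration, not code from the Barron & Barron paper): a genetic algorithm
searches over candidate structures (here just the hidden-layer size), and
the fitness function trades training error against a complexity penalty.
The error and penalty terms are made-up stand-ins; the point is only that
"which structure is better?" becomes a penalized criterion the GA optimizes.

```python
import random

random.seed(0)

def fitness(hidden_units):
    # Toy stand-in for training error: improves with capacity,
    # but with diminishing returns.
    error = 1.0 / (1 + hidden_units)
    # Complexity-regularization term: each hidden unit costs a little.
    penalty = 0.01 * hidden_units
    return -(error + penalty)          # higher fitness is better

def mutate(h):
    # Perturb the structure by adding or removing one hidden unit.
    return max(1, h + random.choice([-1, 1]))

# Population of candidate structures (hidden-layer sizes).
population = [random.randint(1, 20) for _ in range(10)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]         # truncation selection
    offspring = [mutate(random.choice(survivors)) for _ in range(5)]
    population = survivors + offspring

best = max(population, key=fitness)
```

With this particular (artificial) fitness, the penalized criterion peaks
around 9 hidden units, so the GA settles on a moderate-sized network rather
than the largest one -- the complexity penalty doing its job.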

Dean Abbott

-- 
PAR Government Systems Corp.     |
1010 Prospect St., Suite 200     |
La Jolla, CA 92037               |
