
Genetic Algorithms Digest    Wednesday, 27 June 1990    Volume 4 : Issue 11

 - Send submissions to GA-List@AIC.NRL.NAVY.MIL
 - Send administrative requests to GA-List-Request@AIC.NRL.NAVY.MIL

Today's Topics:
	- Re: using GA to generate hypothesis valid against historical data 
	- GA's in Data Compression
	- Origin of Brain speculation
	- Genetic Algorithm Tool (Looking for Beta test sites)
	- Evolving Networks - New TR
	- Call for Papers: ACM TOMACS

******************************************************************************

CALENDAR OF GA-RELATED ACTIVITIES: (with GA-List issue reference)

Workshop Foundations of GAs (v3n19)                           Jul 15-18, 1990
Conference on Simulation of Adaptive Behavior, Paris (v3n21)  Sep 24-28, 1990
Workshop Parallel Prob Solving from Nature, W Germany (v4n5)  Oct 1-3,   1990
2nd Intl Conf on Tools for AI, Washington, DC (v4n6)          Nov 6-9,   1990
4th Intl. Conference on Genetic Algorithms (v4n9)             Jul 14-17, 1991

(Send announcements of other activities to GA-List@aic.nrl.navy.mil)

******************************************************************************

Date: Tue, 19 Jun 90 09:18:45 -0500
From: Riyaz Sikora <sikora@uxh.cso.uiuc.edu>
Subject: Re: using GA to generate hypothesis valid against historical data


>>  Mr. Havener wrote: (GA-List v4n9)
>>
>>      I need to apply GA to learn from a vast database of historical
>> readings.
>> ........ [rest deleted]
>
> Mr. Hall replied: (GA-List v4n10)
>
>     I advise against using a GA (or neural net for that matter) to learn
> such concepts. There are a family of algorithms that are tuned to solve
> this problem, and you will be hard pressed to beat them with a reasonable
> amount of CPU.
>          ..........[rest deleted]

In some cases a GA can be much more effective at classification than
the partitioning algorithms, especially if the function over the hypothesis
space is badly behaved. It is true that in terms of computational
complexity the GA fares badly, but that is a separate issue of the
trade-off between performance and complexity.
I tried using a GA for classification on a bankruptcy data set and compared
its results with ID3 and PLS1, as shown below:

	method		prediction accuracy	 rule size
	------		-------------------	 ---------
	 GA		66.8% (s.d. = 5.52)	 5.2  (s.d. = 0.60)
	 ID3		56.7% (s.d. = 8.98)	 7.0  (s.d. = 1.27)
	 PLS1		67.2% (s.d. = 9.34)	 8.63 (s.d. = 1.91)
	------		-------------------	 ---------
(All figures above are averages over 10 runs.)

The accuracy of the GA was comparable to PLS1 and better than ID3, and in
terms of rule size (i.e., the number of disjuncts in the concept), which is
a measure of conciseness, the GA was the best.
I suspect the GA was able to perform well in this case because of the badly
behaved hypothesis space (which may also be the reason for the poor
accuracy of all three algorithms).

> ....You might want to look at Larry Rendell's paper in the proceedings
> of the first GA conference for an example of combining a GA with a 
> split-based concept learner (PLS2).
>    ................
 
As a matter of fact, last semester, when I took Prof. Larry Rendell's
course on machine learning, I developed a double-layered learning system
(DLS) for my project. It is similar to PLS2 in the sense that it combines
a GA with PLS1, but it is functionally very different. I won't go into the
details, but I got the following results when I ran the DLS algorithm on
the same data.

	method		prediction accuracy	rule size
	------		-------------------	---------
	 GA		      66.8%		   5.2
	 PLS1		      67.2%		   8.63
	 DLS		      69.4%		   4.6
	------		-------------------	---------
Although the improvement is small, it shows that a GA can be used
effectively in classification problems.
I have tested the algorithm only on the bankruptcy data and would be
interested in trying it on other real-world data sets. I would appreciate
any help in obtaining such data sets.
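For readers unfamiliar with this use of GAs, the general idea can be
sketched as follows. This is my own toy illustration, not Mr. Sikora's
system: the data, the ternary rule encoding, and the operator choices are
all assumptions made for the example. A chromosome encodes one rule over
binary features, with '#' marking a don't-care position, and fitness is
simply training-set accuracy.

```python
import random

random.seed(0)

# Hypothetical training set over 4 binary features; the target concept
# is: class 1 iff x[0] == 1 and x[2] == 0.
DATA = [([a, b, c, d], int(a == 1 and c == 0))
        for a in (0, 1) for b in (0, 1)
        for c in (0, 1) for d in (0, 1)]

# A chromosome is a list over {'0', '1', '#'}; '#' means "don't care".
# It predicts class 1 when every non-'#' position matches the example.
def predict(chrom, x):
    return int(all(g == '#' or int(g) == xi for g, xi in zip(chrom, x)))

def fitness(chrom):
    return sum(predict(chrom, x) == y for x, y in DATA) / len(DATA)

def mutate(chrom, rate=0.1):
    return [random.choice('01#') if random.random() < rate else g
            for g in chrom]

def crossover(a, b):
    cut = random.randrange(1, len(a))          # one-point crossover
    return a[:cut] + b[cut:]

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

pop = [[random.choice('01#') for _ in range(4)] for _ in range(30)]
for _ in range(40):
    elite = max(pop, key=fitness)              # simple elitism
    pop = [elite] + [mutate(crossover(tournament(pop), tournament(pop)))
                     for _ in range(len(pop) - 1)]

best = max(pop, key=fitness)
print(''.join(best), fitness(best))
```

On this toy concept the GA usually recovers the rule "1#0#", i.e., exactly
the target; a real system like the one described above would of course
evolve sets of disjuncts rather than a single rule.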

Riyaz Sikora
Dept. of Business Administration,
Univ. of Illinois at Urbana-Champaign
sikora@uxh.cso.uiuc.edu
 
-------------------------------------

Date: Wed, 20 Jun 90 14:06:19 EDT
From: Ubit Azwar <azwar@cs.duke.edu>
Subject: GA's in Data Compression

      I was wondering if anyone has heard of genetic algorithm applications
      in data compression. If so, could you please send me some references
      to that work?
      Thank you.
      Ubit Azwar

---------------------------------------

Date: Mon, 25 Jun 90 18:41:07 pdt
From: kingsley@hpwrc02.hp.com
Subject: Origin of Brain speculation

Would anyone like to speculate about which part of our brain comes
from our fathers, and which part comes from our mothers? Is it
half and half? Does the right half come from Mom and the left half
from Dad? Vice versa? Or, considering that humans have 23 pairs of
chromosomes, which cross over during reproduction, perhaps the Mom
and Dad parts are separated by a 23-dimensional hyperplane. But what
are the dimensions? Space? Neurons? Synapses? Axons, dendrites or
thresholds? Cell adhesion molecules?

Has anyone seen studies of the genetics of intelligence, akin to 
Mendel's studies of peas? Or has anyone seen studies of the
distribution of brain genes along a chromosome? Has anyone
mapped the genes for brains?

And why does everyone have a cortex, a cerebellum, cortical columns
and a corpus callosum? Why doesn't the material we inherit from 
our parents just make a blob? 

						Kingsley Morse
						kingsley@hpwrc02.hp.com

-------------------------------------

Date: Tue, 26 Jun 90 21:57:47 GMT
From: bayer@mpad.span
Subject: Genetic Algorithm Tool (Looking for Beta test sites)

  The Software Technology Branch at NASA's Johnson Space
  Center is developing a genetic algorithm (GA) tool for 
  search and optimization.  The main goal of this project is to
  develop a generic and flexible GA tool that can be used for GA
  research and education, and that can be augmented for specific
  applications (e.g., job shop scheduling).

  This announcement is a solicitation for participants in a beta 
  evaluation of the tool.  (A version 1.0 release of the tool is 
  targeted for the end of the summer.)  Beta testing will be 
  informal and will seek usability information, bug reports, and 
  user requirements (critical changes and wish lists).

  The beta version of the GA tool was developed using the C language, 
  for portability, on the Apple Macintosh.  (Future versions of the
  tool will support other platforms.)  The tool provides multiple 
  selection operators (e.g., roulette wheel, tournament) as 
  well as multiple crossover and mutation operators for both chromosome 
  (i.e., bit string) and permutation problems.
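  For readers new to these operators, here is a minimal sketch of
  fitness-proportionate ("roulette wheel") selection. This is my own
  illustration, not code from the NASA tool, and the population and
  fitness values are made up for the example:

```python
import random

random.seed(42)

def roulette_wheel(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    spin = random.uniform(0, total)        # where the wheel stops
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if spin <= running:
            return individual
    return population[-1]                  # guard against float rounding

# Sanity check: 'c' holds 70% of the wheel, so it should dominate.
pop = ['a', 'b', 'c']
fits = [1.0, 2.0, 7.0]
counts = {p: 0 for p in pop}
for _ in range(10000):
    counts[roulette_wheel(pop, fits)] += 1
print(counts)
```

  Over many draws the selection frequencies approach the fitness
  proportions, which is what distinguishes this operator from rank-based
  schemes such as tournament selection.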

  A graphical user interface has been developed using the
  Macintosh toolbox.  This interface allows menu selection of genetic 
  operators, easy definition of control parameters and problem encoding, 
  and graphical display of statistical information as the genetic 
  algorithms are executing. (A simple ascii interface is also available 
  and the development of an X-Windows interface is being considered).

  The GA tool will be distributed with preliminary User's and
  Reference manuals and several stand-alone, executable examples--for 
  example, a traveling salesperson problem and a polynomial maximization
  problem--that will run on any Apple Macintosh.  To create other 
  applications, a fitness module will need to be created using the C 
  language and linked with the GA and user interface code.  For the beta 
  version of the tool, this will require version 4.0 of Symantec's Think C 
  for the Apple Macintosh--a relatively inexpensive C compiler.

  If you would be interested in participating in this beta evaluation, 
  please reply to:
	  Steve Bayer
	  The MITRE Corporation
	  1120 NASA Road One, Suite 600
	  Houston, TX  77058


	  (713) 333-0978
	  bayer@mpad.span.nasa.gov

------------------------------------------

Date: Tue, 26 Jun 90 05:26:18 PDT
From: rbelew@UCSD.EDU (Rik Belew)
Subject: Evolving Networks - New TR


			  EVOLVING NETWORKS:
		     USING THE GENETIC ALGORITHM
		     WITH CONNECTIONIST LEARNING
				   
			   Richard K. Belew
			    John McInerney
			 Nicolaus Schraudolf
				   
	      Cognitive Computer Science Research Group
	       Computer Science & Engr. Dept. (C-014)
		    Univ. California at San Diego
			  La Jolla, CA 92093
			   rik@cs.ucsd.edu
				   
		    CSE Technical Report #CS90-174
			      June, 1990
				   
			       ABSTRACT

  It is appealing to consider hybrids of neural-network learning
  algorithms with evolutionary search procedures, simply because Nature
  has combined the two so successfully.  In fact, computational models of
  learning and evolution offer theoretical biology new tools for
  addressing questions about Nature that have dogged that field since
  Darwin.  The concern of this paper, however, is strictly
  artificial: Can hybrids of connectionist learning algorithms and
  genetic algorithms produce more efficient and effective algorithms
  than either technique applied in isolation?  The paper begins with a
  survey of recent work (by us and others) that combines Holland's
  Genetic Algorithm (GA) with connectionist techniques and delineates
  some of the basic design problems these hybrids share.  This analysis
  suggests the dangers of overly literal representations of the network
  on the genome (e.g., encoding each weight explicitly).  A preliminary
  set of experiments that use the GA to find unusual but successful
  values for BP parameters (learning rate, momentum) are also reported.
  The focus of the report is a series of experiments that use the GA to
  explore the space of initial weight values, from which two different
  gradient techniques (conjugate gradient and back propagation) are then
  allowed to optimize.  We find that use of the GA provides much greater
  confidence in the face of the stochastic variation that can plague
  gradient techniques, and can also allow training times to be reduced
  by as much as two orders of magnitude.  Computational trade-offs
  between BP and the GA are considered, including discussion of a
  software facility that exploits the parallelism inherent in GA/BP
  hybrids.  This evidence leads us to conclude that the GA's GLOBAL
  SAMPLING characteristics complement connectionist LOCAL SEARCH
  techniques well, leading to efficient and reliable hybrids.
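  To convey the flavor of such a hybrid, here is a minimal toy sketch.
  It is my own illustration, not the authors' code: back propagation on a
  network is replaced by plain gradient descent on a simple non-convex
  test function, and the GA's operators and parameters are arbitrary
  choices. The GA globally samples starting points; gradient descent
  plays the role of the local connectionist learner.

```python
import math, random

random.seed(1)

# Non-convex surface with many local minima, standing in for a
# network's error surface (global minimum of 0 at the origin).
def loss(w):
    return sum(wi * wi - 10 * math.cos(2 * math.pi * wi) + 10 for wi in w)

def grad(w):   # analytic gradient of the loss above
    return [2 * wi + 20 * math.pi * math.sin(2 * math.pi * wi) for wi in w]

def descend(w, steps=100, lr=0.001):
    """Local search: plain gradient descent from initial point w."""
    for _ in range(steps):
        g = grad(w)
        w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w

# GA over initial points: fitness = (negated) loss after local refinement.
def fitness(w):
    return -loss(descend(w))

pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(20)]
for _ in range(15):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                         # truncation selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        children.append([(x + y) / 2 + random.gauss(0, 0.3)
                         for x, y in zip(a, b)])  # blend crossover + mutation
    pop = parents + children

best = descend(max(pop, key=fitness))
print(loss(best))
```

  Gradient descent alone gets trapped in whichever basin it starts in;
  the GA's sampling of initial points steers the population toward basins
  with low refined loss, which is the complementarity argued for above.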

	--------------------------------------------------

  If possible, please obtain a postscript version of this technical report
  from the pub/neuroprose directory at cheops.cis.ohio-state.edu.
  Here are the directions:

  /*** 	Note:  This file is not yet in place.  Give us a few days,	***/
  /***	say until after 4th of July weekend, before you try to get it.	***/

  unix> ftp cheops.cis.ohio-state.edu          # (or ftp 128.146.8.62)
  Name (cheops.cis.ohio-state.edu:): anonymous
  Password (cheops.cis.ohio-state.edu:anonymous): neuron
  ftp> cd pub/neuroprose
  ftp> type binary
  ftp> get
  (remote-file) evol-net.ps.Z
  (local-file) foo.ps.Z
  ftp> quit
  unix> uncompress foo.ps.Z
  unix> lpr -P(your_local_postscript_printer) foo.ps


  If you do not have access to a postscript printer, copies of this
  technical report can be obtained by sending requests to:

	  Kathleen Hutcheson
	  CSE Department (C-014)
	  Univ. Calif. -- San Diego
	  La Jolla, CA 92093

  Ask for CSE Technical Report #CS90-174, and enclose $3.00 to cover
  the cost of publication and postage.

--------------------------------------------

Date: Mon, 18 Jun 90 21:16:35 -0400
From: Paul Fishwick <fishwick@fish.cis.ufl.edu>
Subject: ACM TOMACS

   *******************************************************************
   *                     CALL FOR PAPERS FOR                         *
   *  ACM TRANSACTIONS ON MODELING AND COMPUTER SIMULATION (TOMACS)  *
   *******************************************************************

PURPOSE AND SCOPE: Papers are solicited for TOMACS, an ACM quarterly treating
all aspects of computer simulation and the modeling of complex systems that
are studied through simulation techniques.  The goal of the Editorial Board
is that TOMACS shall be the archival journal for research in discrete event
simulation, which is the topical emphasis.  Distinctive application papers
and outstanding reviews or tutorials are appropriate within the research
focus of TOMACS.  The breadth of subject coverage is illustrated below:

o combined simulation                     o credibility assessment
o distributed simulation                  o experiment design
o hybrid simulation                       o model complexity
o model development environments          o model diagnostic techniques
o model management                        o model representation
o model verification and validation       o modeling methodology
o modeling theory and concepts            o output analysis techniques
o parallel simulation                     o process generators
o program generators                      o qualitative simulation
o random number generation                o random number testing
o real-time simulators                    o run-time efficiency
o simulation and artificial intelligence  o simulation and computer gaming
o simulation and computer graphics        o simulation applications
o simulation metamodels                   o simulation programming languages
o specification languages                 o transformation of random variates
o variance reduction                      o visual interactive simulation

SUBMISSION: Five copies of the complete manuscript should be submitted to
the Editor-in-Chief. Submission of a manuscript is a representation that 
the paper has not been published or submitted for publication elsewhere 
and that the work has been released for open publication.

REVIEW: Submissions are assigned to an Area Editor, who will handle the
review process and recommend disposition to the Editor-in-Chief.  Area
definition and editorial appointments are in process.

     Richard E. Nance,  Editor-in-Chief
     Systems Research Center,
     320 Femoyer Hall
     Virginia Polytechnic Institute and State University
     Blacksburg, Virginia  24061-0251, U.S.A.

     Phone:     (703) 231-6144
     Internet:  nance@vtopus.cs.vt.edu
     Bitnet:    nance@vtvm1.cc.vt.edu
     Facsimile: (703) 231-6075

EDITORIAL ADVISORY BOARD:

     George S. Fishman	              James O. Henriksen
     University of North Carolina     Wolverine Software Corporation

     Robert G. Sargent                Bernard P. Zeigler
     Syracuse University              University of Arizona

AREA EDITORS:

     Osman Balci                      Quality Assurance
      Virginia Polytechnic Institute   (credibility assessment, model
            & State University          V&V, model diagnosis, simulation
                                        environments, simulation support)

     Luc Devroye                      Representation of Uncertainty
      McGill University                (random number generation, testing,
                                        transformations, data modeling
                                        distribution fitting)

     Paul A. Fishwick                 Extensions and Relations
      University of Florida            (knowledge-based, qualitative,
                                        combined and hybrid simulation,
                                        expert systems, AI)   

     Richard Fujimoto                 Model Execution
      Georgia Institute of Technology  (distributed and parallel simulation,
                                        run-time organization (events lists,
                                        etc.), real-time simulation)

     Philip Heidelberger              Simulation Analysis
      IBM Research Division            (output analysis, variance reducing
                                        techniques, experiment design, 
                                        sensitivity analysis, metamodels)

     Stephen D. Roberts               Modeling: Concepts to Implementation
       Regenstrief Institute           (modeling theory, modeling concepts
                                        and structures, methodologies,
                                        specification languages, SPLs)

           Vacant                     User/Model Interactions
                                       (decision aiding and support, computer
                                        gaming, model management, case studies,
                                        graphical modeling, model animation)


--------------------------------
End of Genetic Algorithms Digest
********************************

