Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!math.ohio-state.edu!usc!nic-nac.CSU.net!charnel.ecst.csuchico.edu!csusac!csus.edu!netcom.com!vlsi_lib
From: vlsi_lib@netcom.com (Gerard Malecki)
Subject: Alternate data representations for NNs.
Message-ID: <vlsi_libCvIJIs.Csx@netcom.com>
Organization: VLSI Libraries Incorporated
Date: Fri, 2 Sep 1994 17:35:15 GMT
Lines: 17

So far, all the neural nets I have encountered take either real numbers
or binary values for their inputs and outputs. Have there been attempts
to study neural nets that operate over other fields, such as the complex
numbers? Are there any relative advantages or disadvantages in doing so?
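To make the question concrete, here is a minimal sketch of what a single
complex-valued neuron might look like. The numbers are made up, and the
"split" activation (a real nonlinearity applied to the real and imaginary
parts separately) is just one of several possible choices, not an
established standard:

```python
import math

def complex_neuron(inputs, weights, bias):
    """Forward pass of one complex-valued neuron (illustrative sketch).

    The weighted sum uses ordinary complex arithmetic; the nonlinearity
    is a "split" activation, tanh applied separately to the real and
    imaginary parts. (An activation that is both bounded and analytic
    on the whole complex plane is impossible by Liouville's theorem,
    so split activations are one workaround.)
    """
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return complex(math.tanh(z.real), math.tanh(z.imag))

# Two complex inputs, complex weights and bias (arbitrary values):
out = complex_neuron([1 + 1j, 0.5 - 0.2j],
                     [0.3 + 0.1j, -0.2 + 0.4j],
                     0.1 + 0j)
```

The interesting design question is exactly which parts of the real-valued
perceptron (the multiply, the sum, the squashing function) survive the
move to a new field and which have to be reinvented.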
 
More generally, we can extend the inputs and outputs to abstract
objects that are not even fields, like the Lisp programs used in
genetic algorithms. In that case, the concepts of "weight" and
"threshold" would have to be generalized, and so would the operation
of the perceptron itself. Such a generalization should also give rise
to some interesting learning rules. My feeling is that such
abstractions would give neural nets a more powerful semantic basis
than current implementations allow.
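One way to picture the generalization is to abstract the three field
operations out of the perceptron entirely, so "multiply by a weight",
"sum", and "threshold" become parameters. The sketch below is purely
illustrative; the min-plus instantiation is just one structure (not a
field) that happens to fit the same skeleton:

```python
from functools import reduce

def generalized_perceptron(inputs, weights, combine, aggregate, activate):
    """Perceptron skeleton with the field operations abstracted out:
    combine plays the role of input-times-weight, aggregate the role
    of summation, and activate the role of the threshold."""
    terms = (combine(w, x) for w, x in zip(weights, inputs))
    return activate(reduce(aggregate, terms))

# Ordinary real-valued perceptron: 0.5*1.0 + 0.25*(-2.0) = 0.0
real_out = generalized_perceptron(
    [1.0, -2.0], [0.5, 0.25],
    combine=lambda w, x: w * x,
    aggregate=lambda a, b: a + b,
    activate=lambda s: 1 if s > 0 else 0)

# Min-plus ("tropical") instantiation: weights act additively and the
# aggregation takes a minimum -- a shortest-path-style unit.
trop_out = generalized_perceptron(
    [1.0, 4.0], [2.0, 0.5],
    combine=lambda w, x: w + x,
    aggregate=min,
    activate=lambda s: s)
```

A learning rule would then have to be stated in terms of the abstract
operations as well, which is where I suspect the interesting questions are.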
 
Ideas and suggestions posted back to the net are welcome.
 
Shankar Ramakrishnan
