Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!europa.chnt.gtegsc.com!usenet.eel.ufl.edu!col.hp.com!csn!gw1.att.com!nntpa!bigtop!elvis!jmarkus
From: jmarkus@elvis.dr.att.com (John W. Markus)
Subject: Optimized perceptron learning?
Message-ID: <DDIL2L.LFv@bigtop.dr.att.com>
Sender: news@bigtop.dr.att.com (Netnews Administration Login)
Nntp-Posting-Host: elvis
Organization: AT&T Bell Laboratories, Denver
Date: Fri, 18 Aug 1995 16:13:33 GMT
Lines: 22


I'm designing and programming a neural net.  By doing some creative
pre-processing, I think I can get non-linear behavior out of a simple
perceptron.  This would be nice, since the problem would then essentially
reduce to a linear one.  (At least the way I understand it.)
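For concreteness, the kind of pre-processing I have in mind is something
like augmenting each input vector with product terms (the exact feature
set below is just illustrative, not my actual scheme):

```python
from itertools import combinations

def expand(x):
    """Augment an input vector with all pairwise products.

    Product terms like x[i]*x[j] can make patterns (e.g. XOR) that
    are not linearly separable in the raw input space separable for
    a single-layer perceptron.  Taking all higher-order products is
    what blows up the input count exponentially.
    """
    feats = list(x)
    for i, j in combinations(range(len(x)), 2):
        feats.append(x[i] * x[j])
    return feats

print(expand([1, 0]))  # [1, 0, 0]
print(expand([1, 1]))  # [1, 1, 1] -- the product term separates XOR
```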

However, the pre-processing winds up creating an exponential number of
inputs to the perceptron for each input into the net.  I was wondering
if there are any speed-optimized learning techniques for perceptrons.
Since this is a linear problem, would I be better off trying to use
statistical analysis (which I know little about)?
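For reference, the baseline I'm starting from is just the classic
perceptron learning rule, sketched roughly like this (names and
parameters are my own, nothing standard):

```python
def train_perceptron(samples, n_inputs, epochs=100, lr=1.0):
    """Classic perceptron learning rule: update weights only on
    misclassified examples.  By the perceptron convergence theorem
    this halts after finitely many updates when the expanded data
    are linearly separable."""
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, target in samples:  # target is +1 or -1
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != target:
                errors += 1
                for i in range(n_inputs):
                    w[i] += lr * target * x[i]
                b += lr * target
        if errors == 0:  # separated the training set; stop early
            break
    return w, b
```

It's the per-update cost over all those exponentially many inputs that
I'm hoping can be optimized away somehow.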

Eventually when I am done programming, I'd like to benchmark the net.
Are there any specific data sets that seem to be a fairly standard
benchmarking set?  If so, are they available via ftp?

Also, are there any small analog data sets (2 or 3 inputs, 1 output)
that I could use for testing my methodology?

John-
jmarkus@drmail.dr.att.com

