Newsgroups: comp.ai.neural-nets
From: Malcolm@celtic.demon.co.uk (Malcolm Grandis)
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!hudson.lm.com!news.pop.psu.edu!psuvax1!uwm.edu!math.ohio-state.edu!howland.reston.ans.net!news.sprintlink.net!demon!celtic.demon.co.uk!Malcolm
Subject: Re: Times Series Prediction using NN
References: <chris.11.000D61B3@gauss.cam.wits.ac.za> <1994Dec29.151551.26281@news.uminho.pt>
Distribution: world
Organization: None
Reply-To: Malcolm@celtic.demon.co.uk
X-Newsreader: Newswin Alpha 0.6
Lines:  15
X-Posting-Host: celtic.demon.co.uk
Date: Mon, 2 Jan 1995 00:59:03 +0000
Message-ID: <219160241wnr@celtic.demon.co.uk>
Sender: usenet@demon.co.uk

One hidden layer is about two too few for prediction tasks, in my 
experience. Also, the inputs to the net can often be reduced using some 
simple pre-processing/scoring step. While this means the result is no 
longer a pure neural-net function, it does let you use a more complex 
layering/interconnection of neurons at little extra processing cost, and 
at the same time it can enormously improve the accuracy of the results.
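As one illustration of the kind of pre-processing/scoring I mean (my own 
sketch, not the poster's method): collapse a window of raw samples into a 
few summary scores, so the net sees three inputs instead of every sample.

```python
# Hypothetical pre-processing sketch: reduce a raw window of the series
# to a few summary "scores" (mean, trend, spread) rather than feeding
# every raw sample to the network as a separate input.

def preprocess_window(window):
    """Collapse a list of recent values into three derived inputs."""
    n = len(window)
    mean = sum(window) / n
    trend = (window[-1] - window[0]) / n          # average step per sample
    var = sum((x - mean) ** 2 for x in window) / n
    # Assumes the series has already been normalised to a sensible range.
    return [mean, trend, var ** 0.5]

series = [0.1, 0.2, 0.15, 0.3, 0.25, 0.4, 0.35, 0.5]
inputs = preprocess_window(series[-5:])   # 5 raw values -> 3 net inputs
print(inputs)
```

The names and the particular scores here are my own choices for 
illustration; any cheap statistic that captures structure you care about 
would do.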
Are you coding from scratch? You do not say. If you are, there are some 
other tweaks you can use to improve performance in such an environment, 
especially if substantial history is available (as in your case).
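One standard way to exploit substantial history (a sketch of an assumed 
technique, since the poster does not name the tweaks): slide a fixed-size 
window along the series so every position yields a training pair.

```python
# Sketch (my assumption, not the poster's specific tweak): generate many
# overlapping (inputs, target) training pairs from one long history by
# sliding a fixed-size window along the series.

def make_training_pairs(series, window=4):
    """Each run of `window` consecutive values predicts the next value."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

history = list(range(10))     # stand-in for a long recorded history
pairs = make_training_pairs(history)
print(len(pairs))             # 6 pairs from 10 samples with window=4
```

The window length and the one-step-ahead target are arbitrary choices 
here; the point is simply that N samples of history give you roughly N 
training examples, not one.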
 _     _
/     /
\_ELTI\_

