Newsgroups: comp.ai.neural-nets
From: dw@moorsmal.demon.co.uk (Dave Woolcock)
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!hookup!swrinde!pipex!peernews.demon.co.uk!moorsmal.demon.co.uk!dw
Subject: Backprop - and not enough data
Organization: ....
Reply-To: dw@moorsmal.demon.co.uk
X-Newsreader: Demon Internet Simple News v1.29
Lines: 14
X-Posting-Host: moorsmal.demon.co.uk
Date: Sun, 12 Feb 1995 23:40:08 +0000
Message-ID: <792632408snz@moorsmal.demon.co.uk>
Sender: usenet@demon.co.uk

I have been experimenting with a backprop NN, but the problems I am
investigating have only a small number of data lines.  The network appears to
train well, but often produces results that are way off.  Presumably this is
due to its inadequate "experience".

I have been creating bogus data lines to "peg out" the limits of the
problem (e.g. cases where we know the answer will be zero, even though we
don't have an actual datum for them).  So far this has shown mixed results.

I was wondering if there are any other techniques I could use to "make my
data go further", or am I wasting my time with too many unknowns?  How do
I tell?
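One way to tell is leave-one-out cross-validation: with only a handful of
data lines, train on all but one, test on the held-out point, and repeat
for every point.  If the held-out error is much worse than the training
error, the net is memorising rather than generalising.  A sketch with
made-up data, using a linear least-squares fit as a stand-in for the
network so the example stays self-contained:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tiny dataset: 12 samples, 3 features, roughly linear target.
X = rng.uniform(-1.0, 1.0, size=(12, 3))
y = X @ np.array([0.5, -1.0, 2.0]) + rng.normal(0.0, 0.1, size=12)

# Leave-one-out: fit on n-1 points, score on the single held-out point.
errors = []
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    w, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    errors.append((X[i] @ w - y[i]) ** 2)

loo_mse = float(np.mean(errors))
print(f"leave-one-out MSE: {loo_mse:.4f}")
```

The same loop works with any model in place of the lstsq fit; it is just
slow, since the model is retrained n times.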
-- 
Dave Woolcock
