Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!uunet!in1.uu.net!news.tele.fi!news.funet.fi!news.lut.fi!tamminen
From: tamminen@cc.lut.fi (Leena Tamminen)
Subject: Jordan networks on SNNS
Sender: news@lut.fi (Usenet News)
Message-ID: <D7yD7I.8Kr@lut.fi>
Date: Tue, 2 May 1995 13:23:42 GMT
Nntp-Posting-Host: cc.lut.fi
Organization: Lappeenranta University of Technology, Finland
X-Newsreader: NN version 6.5.0 #3 (NOV)
Lines: 42

I am attempting to build a super simple Jordan network (1 input,
1 hidden and 1 output layer!) on the SNNS simulator. It is supposed
to learn the difference equation:

	y(n+1) = (1-a)*y(n) + a*k + b*x(n+1),

where the x's have an average of zero and the rest (a, k, b) are fixed
parameters. I created a pattern file of inputs (x) and targets (y). The
simplest version has all-zero inputs, so the target y's converge to the
value k.
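In case it helps, this is roughly how the target sequence for the
all-zero-input case can be generated (a small Python sketch; the
parameter values here are arbitrary examples, not my actual ones):

```python
# Example parameter values (made up for illustration).
a, k, b = 0.2, 3.0, 0.5

def step(y, x_next):
    """One step of the difference equation
    y(n+1) = (1-a)*y(n) + a*k + b*x(n+1)."""
    return (1 - a) * y + a * k + b * x_next

# Simplest case: all-zero inputs, so the targets drift toward k.
y = 0.0
targets = []
for n in range(100):
    y = step(y, 0.0)   # x(n+1) = 0 for every n
    targets.append(y)

print(targets[-1])  # after 100 steps this is very close to k
```

With zero inputs the update reduces to y(n+1) = (1-a)*y(n) + a*k, whose
fixed point is y = k, which is why the targets converge there.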

I tried to train the network using the pattern file and the Jordan-
Elman functions (JE_Weights for initialization, JE_BP for learning
and JE_Order for update, all with default arguments). All the nodes
have identity activation functions, except the hidden layer, where
the bias term is supposed to represent the a*k term => identityPlusBias.
The problem is that the SSE goes to zero after one training epoch! And
naturally the results make no sense (how can the error
be zero when the outputs are nowhere close to the targets?)

I have been trying to understand SNNS for the past couple of weeks
(the 300-page manual sure is too much for a beginner!). I managed to
get reasonable results from a function approximation network using
standard backpropagation, but the learning required about 10 times as
many epochs as the same network built with the Matlab NN toolbox, so
I suspect I might still be doing something stupid there.

Advice and suggestions are appreciated (I am a beginner in the field
of NNs, so the advice needs to be simple and well explained!)

		Leena	(tamminen@lut.fi)
