Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!EU.net!nova.puug.pt!ciup2.ncc.up.pt!news.uminho.pt!caeiro!si508952
From: si508952@caeiro.ci.uminho.pt (miguel francisco a.p.rocha)
Subject: Q: Mixed sigmoid-linear functions in BPN
Message-ID: <1995Jan16.171748.6294@news.uminho.pt>
Sender: newsadm@news.uminho.pt (Network News Account)
Organization: Universidade do Minho, Braga, Portugal
Date: Mon, 16 Jan 1995 17:17:48 GMT
Lines: 22


We are developing a Time Series Forecasting system based on neural networks. Our
networks are trained with back-propagation, using our own implementation. The
activation function is the sigmoid, and we have been normalising the inputs and
outputs.
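For context, by "normalising" we mean the usual min-max scaling of the targets into the sigmoid's output range before training, and inverting it afterwards. A rough sketch (the interval [0.1, 0.9] and the function names are just illustrative, not our actual code):

```python
def minmax_scale(series, lo=0.1, hi=0.9):
    """Map a series linearly into [lo, hi], so a sigmoid output can reach every target."""
    s_min, s_max = min(series), max(series)
    scaled = [lo + (hi - lo) * (s - s_min) / (s_max - s_min) for s in series]
    return scaled, s_min, s_max

def minmax_unscale(y, s_min, s_max, lo=0.1, hi=0.9):
    """Invert the scaling to recover a forecast in the original units."""
    return s_min + (s_max - s_min) * (y - lo) / (hi - lo)

scaled, s_min, s_max = minmax_scale([10.0, 25.0, 40.0])
print(scaled)                                   # endpoints map to lo and hi
print(minmax_unscale(scaled[1], s_min, s_max))  # midpoint maps back to 25
```

With a linear output layer this pre- and post-processing step would disappear, which is the appeal of the suggestion below.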
Recently someone suggested that we use the sigmoid function in the hidden
layers and a linear function at the output nodes, so that we would not need to
normalise.
We tried to implement this, but the standard back-prop algorithm does not
behave well.
Has anyone already tried this with back-prop?
What changes do we have to make to the algorithm?
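To make the question concrete, here is a minimal sketch of the variant we mean, with our own invented names and a toy one-hidden-layer net (not our real implementation). The textbook change, as we understand it, is only in the output delta: for a linear unit f(net) = net the derivative is 1, so the sigmoid derivative factor y*(1-y) must be dropped there:

```python
import math
import random

random.seed(0)

# Minimal one-hidden-layer net: sigmoid hidden units, LINEAR output unit.
N_IN, N_HID = 1, 4
LR = 0.05

# weights[j][0] is the bias of unit j.
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(N_IN + 1)] for _ in range(N_HID)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(N_HID + 1)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    h = [sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))) for w in w_hid]
    # Linear output: just the weighted sum, no sigmoid squashing.
    y = w_out[0] + sum(wi * hi for wi, hi in zip(w_out[1:], h))
    return h, y

def train_step(x, target):
    h, y = forward(x)
    # Key change: with a linear output, f'(net) = 1, so the output delta
    # is just (target - y), NOT (target - y) * y * (1 - y).
    delta_out = target - y
    # Hidden deltas keep the usual sigmoid derivative h * (1 - h);
    # compute them before the output weights are updated.
    delta_h = [delta_out * w_out[j + 1] * h[j] * (1.0 - h[j]) for j in range(N_HID)]
    w_out[0] += LR * delta_out
    for j in range(N_HID):
        w_out[j + 1] += LR * delta_out * h[j]
        w_hid[j][0] += LR * delta_h[j]
        for i in range(N_IN):
            w_hid[j][i + 1] += LR * delta_h[j] * x[i]

# Toy problem with UNNORMALISED targets in [1, 4]: t = 3x + 1, x in [0, 1].
data = [([x / 10.0], 3.0 * x / 10.0 + 1.0) for x in range(11)]
for epoch in range(2000):
    for x, t in data:
        train_step(x, t)

errors = [abs(forward(x)[1] - t) for x, t in data]
print("max abs error:", max(errors))
```

If the sketch above is right, a plausible reason for bad behaviour is the learning rate: with unnormalised targets the output deltas are much larger, so the rate that worked for the all-sigmoid net may be too big.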

Please answer by e-mail

/-------------------------------------------------------------/
/              Paulo Cortez and Miguel Rocha                  /
/                                                             /
/                  e-Mail: si508952@ci.uminho.pt              /
/                          si508957@ci.uminho.pt              /
/-------------------------------------------------------------/
-- 