Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!gatech!emf.emf.net!overload.lbl.gov!lll-winken.llnl.gov!uop!csus.edu!csulb.edu!info.ucla.edu!library.ucla.edu!news.ucdavis.edu!sunnyboy.water.ca.gov!sunnyboy!nsandhu
From: nsandhu@venice.water.ca.gov (Nicky Sandhu)
Subject: Backprop on %error
Message-ID: <NSANDHU.95Jun2110405@grizzly.water.ca.gov>
Sender: news@sunnyboy.water.ca.gov
Organization: Calif. Dept. of Water Resources
Date: Fri, 2 Jun 1995 19:04:05 GMT
Lines: 16

Hi,
	In my attempt to use % error as the objective function, I
modified the backprop algorithm as follows, proceeding much as in
the usual derivation for sum-squared error:
	Delta = Del(error**2)/Del(w)
Comment: Del == partial derivative operator
where error = (target - model)/target
	(Note: target != 0)
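To make the change concrete, here is a minimal sketch (my own
illustration, not the poster's actual code) of gradient descent on the
squared relative error for a single linear unit. Differentiating
((target - y)/target)**2 with respect to w gives the ordinary delta
rule with an extra 1/target**2 factor, which is the only change from
the sum-squared-error update:

```python
import numpy as np

# Toy data: targets are kept well away from zero, as the post requires.
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 2.0, size=(100, 3))
true_w = np.array([0.5, 1.0, 2.0])
t = X @ true_w              # targets, all in roughly [3.5, 7]

w = np.zeros(3)             # weights of one linear unit, y = x . w
lr = 0.05
for epoch in range(200):
    for x, tgt in zip(X, t):
        y = x @ w
        # E = ((tgt - y)/tgt)**2  =>  dE/dw = -2*(tgt - y)/tgt**2 * x,
        # so the update is the delta rule scaled by 1/tgt**2.
        w += lr * 2.0 * (tgt - y) / tgt**2 * x

pct_err = np.mean(np.abs((t - X @ w) / t)) * 100
print(f"mean % error after training: {pct_err:.3f}")
```

Targets with small magnitude get proportionally larger updates, which
is exactly why this objective drives % error down faster than plain
sum-squared error does.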
	I recalculated the weight updates using this error term's
partial derivatives and reran the program. It did optimize differently
and gave better results when measured by % error.
	My question is: even though it's working, is this approach
valid, or do I have to resort to simulated annealing etc.? I would
like to hear any arguments or objections anyone might have.
-Nicky Sandhu
