Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!tank.news.pipex.net!pipex!howland.reston.ans.net!news.sprintlink.net!malgudi.oar.net!utnetw.utoledo.edu!news
From: dweddin@uoft02.utoledo.edu
Subject: BACKPROP QUESTION!
Message-ID: <DDov2s.2BJ@utnetw.utoledo.edu>
Lines: 51
Sender: news@utnetw.utoledo.edu (News Manager)
Organization: University of Toledo
Date: Mon, 21 Aug 1995 16:35:58 GMT



I'm looking at Backpropagation nets right now and I have a quick
question. When I'm backpropagating the error, I have to calculate
the delta sub o (hereafter referred to as D_o).

The formula that is given is....


D_o = ( d - o )( 1 - o )o


of course i am assuming the simple case of one output node. here "o" is
the output generated by the net; it is a real number from 0.0 to 1.0.
the value "d" is the desired value, which will be either 0 or 1. now
here is my question...


let's assume now that the desired result is 1 (or d=1) and the calculated
result is 0.000000000001 (which rounds off to 0.0 on my computer). so
using the above formula.....I'll plug in the numbers:

D_o = ( 1 - 0.0 )*( 1 - 0.0 )*0.0
    = 0.0

so a 0.0 is getting backpropagated, and my V and W vectors
are not going to be adjusted.


thus, if i have a really large error, the network will not
adjust itself.
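for concreteness, here is a minimal python sketch of the situation. the sigmoid function and the net-input value of -30 are my own assumptions (just to drive o toward 0); the delta formula is the one quoted above:

```python
import math

def sigmoid(x):
    # standard logistic activation; output lies strictly in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def delta_output(d, o):
    # the delta rule from the post: D_o = (d - o)(1 - o)o
    return (d - o) * (1.0 - o) * o

# a healthy case: o = 0.5, d = 1 gives a usable delta
print(delta_output(1.0, 0.5))   # 0.125

# a saturated case: a large negative net input drives o toward 0,
# so the (1 - o)*o derivative factor kills the delta even though
# the raw error (d - o) is nearly 1
o = sigmoid(-30.0)
print(delta_output(1.0, o))     # vanishingly small
```

the (1 - o)*o factor is the sigmoid's derivative, and it goes to zero at both ends of the unit's range, which is exactly why a badly wrong but saturated output produces almost no weight update.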



so here is my question. AM I JUST BEING STUPID AND MISSING THE
OBVIOUS (probably)????

should i just go back to RBF's and say "to hell with it?" 

what am i missing and how do i fix the problem????



please email direct, i am not in here often!

thanks in advance!


Don Wedding


