Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!europa.chnt.gtegsc.com!library.ucla.edu!news.ucdavis.edu!sunnyboy.water.ca.gov!sunnyboy!nsandhu
From: nsandhu@venice.water.ca.gov (Nicky Sandhu)
Subject: Input correlations and their effect on ANNs?
Message-ID: <NSANDHU.95Jun26134350@grizzly.water.ca.gov>
Sender: news@sunnyboy.water.ca.gov
Organization: Calif. Dept. of Water Resources
Date: Mon, 26 Jun 1995 21:43:50 GMT
Lines: 25


Hi there,
	
	In any black-box model, particularly a neural network, I
consider the sensitivity of the outputs to the inputs to be an
important test of the model's correct behaviour.
	The problem is that correlations and interdependencies among
the inputs may skew the apparent sensitivity of the outputs to the
inputs. For example, if the product of two inputs is a constant, the
neural network can neglect one of them and use the information from
the other alone.
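	To make that concrete, here is a small numerical sketch (the
data and coefficients are invented for illustration): two inputs whose
product is a constant, and a target that nominally depends on both, yet
a model fit from the first input alone recovers the target exactly.

```python
import numpy as np

# Two inputs with a constant product: x2 = c / x1 (illustration only).
rng = np.random.default_rng(0)
c = 2.0
x1 = rng.uniform(1.0, 3.0, 200)
x2 = c / x1                 # perfectly (inversely) correlated with x1
y = x1 + x2                 # target nominally uses both inputs

# Since x2 = c / x1, the target is y = x1 + c/x1: a function of x1 alone.
# A least-squares fit on the basis [x1, 1/x1] never sees x2 at all:
A = np.column_stack([x1, 1.0 / x1])
coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)                 # approx [1.0, 2.0]; x2's role absorbed into 1/x1
```

The fit is exact without using x2, so any sensitivity attributed to x2
by perturbing it independently would be meaningless here.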
	The above example is simpler than what I am now trying to
accomplish. I am using neural networks to model salinity in the
Sacramento - San Joaquin Delta. The delta consists of a network of
channels that merge at the mouth of the delta, which is connected to
the ocean. The flows and the status of barriers are used as inputs to
predict the salinity.
	Because the channels are interconnected, changing the flow
magnitude or barrier status at any point affects the flows at other
points. This creates correlations among the flows (the inputs) and
skews the sensitivity testing.
	My question is: "Is there a technique by which I can correctly
analyze the variation in the output with respect to the inputs?"
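	For concreteness, the sensitivity test I have in mind is a
simple one-at-a-time finite-difference perturbation, roughly as sketched
below (`model` stands for any trained network; the names and the toy
model are only for illustration):

```python
import numpy as np

def sensitivity(model, x0, eps=1e-3):
    """One-at-a-time finite-difference sensitivity of a model at x0.

    Each input is perturbed while the others are held fixed -- exactly
    the step that becomes misleading when the inputs (e.g. flows in
    interconnected channels) cannot in fact vary independently.
    """
    x0 = np.asarray(x0, dtype=float)
    base = model(x0)
    sens = np.empty_like(x0)
    for i in range(x0.size):
        xp = x0.copy()
        xp[i] += eps
        sens[i] = (model(xp) - base) / eps
    return sens

# Toy stand-in for a trained salinity model (illustration only):
toy = lambda x: 3.0 * x[0] - 2.0 * x[1]
print(sensitivity(toy, [1.0, 1.0]))
</imports>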
Thanks 
-Nicky
