Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel-eecis!gatech!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Do more outputs help?
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <Dy0D3p.M6F@unx.sas.com>
Date: Fri, 20 Sep 1996 01:50:13 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <50i9pm$d4q@hpavua.lf.hp.com> <50vh61$ror@llnews.ll.mit.edu>
Organization: SAS Institute Inc.
Lines: 33


In article <50vh61$ror@llnews.ll.mit.edu>, Greg Heath <heath@ll.mit.edu> writes:
|> bush@lf.hp.com (Joe Bush) wrote:
|> 
|> >I'm playing with a simple back prop network with one hidden
|> >layer to predict parameters in a manufacturing process. I 
|> >have a good deal of training data.  I also have measured
|> >outputs in my data set that I am not interested in having
|> >my model predict.  Right now, this "additional information"
|> >is seemingly being wasted during the training since I have
|> >not added these to the network outputs.
|> >
|> >Q: Does it make sense to add unneeded outputs to a NN if
|> >they are available for training and then just don't use
|> >them once the model is complete?                    
|> >
|> >Joe
|> >                      
|> 
|> Unneeded outputs will make the net harder to train.

Yes, but it would be easy to construct artificial data where you
could get better generalization if you trained _with_ the unneeded
outputs. For example, this could happen if the "needed" and
"unneeded" outputs were measurements of the same thing except that
the "needed" outputs had more noise than the "unneeded" ones. Since
all the outputs share the same hidden layer, the less noisy outputs
can pull the hidden units toward the underlying signal. Whether
such data ever occur in real life is another matter.
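A minimal numpy sketch of this scenario (my own illustration, not from
the original post): the "needed" target is a noisy measurement of some
underlying signal, the "unneeded" auxiliary target is a cleaner
measurement of the same signal, and a one-hidden-layer net is trained
by plain gradient descent, with and without the auxiliary output.
The data, architecture, and learning rate are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underlying signal and two noisy measurements of it.
n = 500
x = rng.uniform(-1.0, 1.0, size=(n, 1))
signal = np.sin(3.0 * x)
y_needed = signal + rng.normal(0.0, 0.5, size=signal.shape)     # noisier
y_unneeded = signal + rng.normal(0.0, 0.05, size=signal.shape)  # cleaner

def train(targets, hidden=10, epochs=2000, lr=0.1):
    """Train a one-hidden-layer tanh net by full-batch gradient descent."""
    n_out = targets.shape[1]
    W1 = rng.normal(0, 0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, size=(hidden, n_out)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)
        out = h @ W2 + b2
        err = out - targets                 # gradient of 0.5 * mean SSE
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
        gW1 = x.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def predict(params, xq):
    W1, b1, W2, b2 = params
    return np.tanh(xq @ W1 + b1) @ W2 + b2

# Net A: needed output only.  Net B: needed plus auxiliary output.
net_a = train(y_needed)
net_b = train(np.hstack([y_needed, y_unneeded]))

# Evaluate only the "needed" output against the true (noise-free) signal.
xt = np.linspace(-1, 1, 200).reshape(-1, 1)
true = np.sin(3.0 * xt)
mse_a = np.mean((predict(net_a, xt)[:, :1] - true) ** 2)
mse_b = np.mean((predict(net_b, xt)[:, :1] - true) ** 2)
print(f"needed-only MSE: {mse_a:.4f}   with auxiliary output MSE: {mse_b:.4f}")
```

Whether net B actually beats net A depends on the noise levels, the
amount of data, and the run, which is exactly the point: the auxiliary
output can help, but nothing guarantees it will.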

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
