Message-ID: <092321Z06051995@anon.penet.fi>
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!EU.net!news.eunet.fi!anon.penet.fi
Newsgroups: comp.ai.neural-nets
From: an76395@anon.penet.fi
X-Anonymously-To: comp.ai.neural-nets
Organization: Anonymous forwarding service
Reply-To: an76395@anon.penet.fi
Date: Sat,  6 May 1995 09:14:51 UTC
Subject: Re: Mult.linear regression beats NN. Why?
Lines: 37

In article <3odatr$131n@columba.udac.uu.se> palun@strix.udac.uu.se writes:
-:I have a dataset containing four continuous input variables 
-:and one output variable. It is possible to model the output 
-:based on the input using "conventional" multiple regression. 
-:In order to account for non-linear relationships which I 
-:suspected should be present, I tried to use a backprop. 
-:neural network. Surprisingly enough (to me at least) the NN
-:did not yield nearly as good results as the linear method. I 

The PLS cult at Umea Univ of course cannot get better results
with an NN. Mr. Wold would not be happy.

-:Why am I surprised? Well, it's because I thought that a NN ALWAYS 
-:should be able to do AT LEAST AS GOOD as any linear statistical 
-:method... Is this assumption wrong? (I feel I might have missed 
-:something fundamental here.) 

You need to learn more about NN. The assumption only holds for
representational capacity: a backprop net with linear output units *can*
express any linear map, so in principle it can do at least as well. In
practice, though, gradient descent can stall in a poor local minimum,
a net can overfit the little data four inputs give you, and unscaled
inputs or a bad learning rate can easily leave it worse than ordinary
least squares, which finds the exact optimal linear fit in closed form.
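You can see the gap directly. Below is a minimal sketch (mine, not from
the original thread) comparing closed-form least squares against a tiny
backprop net on data that really is linear; the 1-3-1 tanh architecture,
learning rate, and epoch count are illustrative choices, not anything
the poster used:

```python
import math
import random

random.seed(0)

# --- synthetic, perfectly linear data: y = 2x + 1 -------------------
xs = [i / 10.0 for i in range(-20, 21)]        # inputs in [-2, 2]
ys = [2.0 * x + 1.0 for x in xs]
n = len(xs)

# --- ordinary least squares (closed form, one shot) -----------------
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
ols_mse = sum((slope * x + intercept - y) ** 2
              for x, y in zip(xs, ys)) / n     # essentially zero

# --- tiny 1-3-1 tanh net, full-batch gradient descent ---------------
H, lr, epochs = 3, 0.1, 300                    # illustrative choices
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

for _ in range(epochs):
    gw1, gb1 = [0.0] * H, [0.0] * H
    gw2, gb2 = [0.0] * H, 0.0
    for x, y in zip(xs, ys):
        h, yhat = forward(x)
        err = yhat - y                         # dMSE/dyhat up to 2/n
        gb2 += err
        for j in range(H):
            gw2[j] += err * h[j]
            back = err * w2[j] * (1.0 - h[j] ** 2)  # through tanh
            gw1[j] += back * x
            gb1[j] += back
    for j in range(H):                         # gradient step
        w1[j] -= lr * 2.0 * gw1[j] / n
        b1[j] -= lr * 2.0 * gb1[j] / n
        w2[j] -= lr * 2.0 * gw2[j] / n
    b2 -= lr * 2.0 * gb2 / n

nn_mse = sum((forward(x)[1] - y) ** 2 for x, y in zip(xs, ys)) / n

print("OLS MSE:", ols_mse)   # machine-zero: the exact linear optimum
print("NN  MSE:", nn_mse)    # larger: finite training of a tanh net
```

Least squares nails the linear optimum in one step; the net, after a
few hundred epochs, is still only approximating it. That is the whole
story behind "the NN did worse than regression."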

-:
-:Please, drop me a few lines if you can tell me why a NN is
-:out-performed by "traditional" linear regression.

Because you are from Umea Univ and want to sell SIMCA to NN people?

-:
-:Thanks,
-:
-:Ulf
-:<ulf.nordlund@pal.uu.se>


----------------------------------------------------------------------------
To find out more about the anon service, send mail to help@anon.penet.fi.
If you reply to this message, your message WILL be *automatically* anonymized
and you are allocated an anon id. Read the help file to prevent this.
Please report any problems, inappropriate use etc. to admin@anon.penet.fi.
