Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!newshost.marcam.com!zip.eecs.umich.edu!newsxfer.itd.umich.edu!gatech!howland.reston.ans.net!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: NN Vs Stats......
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D2H974.B7B@unx.sas.com>
Date: Mon, 16 Jan 1995 02:49:52 GMT
References: <1995Jan11.145719.1@ulkyvx.louisville.edu> <3f9r6i$imn@newsbf02.news.aol.com> <3fa1j6$1kf@usenet.INS.CWRU.Edu>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 49


In article <3fa1j6$1kf@usenet.INS.CWRU.Edu>, cc439@cleveland.Freenet.Edu (Philip M. Kalina) writes:
|> ...
|> I have used statistical models and neural nets on a variety of problems.
|> I see the basic differences as these:
|>
|>   1.  Statistics are typically used to describe a population or
|>       to test a hypothesis.  When testing a hypothesis you make
|>       assumptions about underlying pdf functions.
|>
|>       Neural netters develop models to recognize patterns or model
|>       functions to make interpolations or extrapolations (forecasts).

Statisticians do that, too.

|>       They are just beginning to learn to factor in assumptions
|>       about distribution functions.

The usual feedforward nets involve the same assumptions as common
forms of maximum likelihood estimation, since they _are_ maximum
likelihood estimation: training with least squares is maximum
likelihood under a Gaussian error model, and training with
cross-entropy is maximum likelihood under a Bernoulli or multinomial
model. It's just that many neural netters don't realize this.
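To make the point concrete, here is a toy sketch (mine, in NumPy, not
anything from the thread): fit a one-weight linear "network" over a
grid of candidate weights, once by minimizing squared error and once
by maximizing a Gaussian log-likelihood. The two criteria select the
same weight, because the negative log-likelihood is just the sum of
squared errors rescaled plus a constant.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.5, size=100)

# Candidate weights for a one-weight "network" y_hat = w * x
ws = np.linspace(0.0, 4.0, 2001)

# Criterion 1: sum of squared errors for each candidate weight
sse = np.array([np.sum((y - w * x) ** 2) for w in ws])

# Criterion 2: Gaussian negative log-likelihood with fixed sigma.
# Note it is an affine function of the SSE, so it has the same minimizer.
sigma = 0.5
nll = np.array([0.5 * np.sum((y - w * x) ** 2) / sigma**2
                + len(x) * np.log(sigma * np.sqrt(2.0 * np.pi))
                for w in ws])

# Least squares IS maximum likelihood under Gaussian errors
assert ws[np.argmin(sse)] == ws[np.argmin(nll)]
```

The same correspondence holds for multilayer nets: the error function
determines the implicit noise model, whether the netter knows it or not.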

|>   2.  Statisticians love parsimony--that is they strive to develop
|>       the simplest model (one with the fewest parameters to estimate)
|>       possible.  Neural netters also worry about the number of weights
|>       relative to the number of training cases, but not as much as the
|>       statisticians.  Minimizing the number of weights is not the only
|>       way to keep your net from overfitting.

That's an exaggeration.  Statisticians are more concerned with
understanding and interpretation, which are facilitated by simple but
accurate models. However, it is well known in statistics that various
forms of regularization similar to weight decay often yield better
predictions than simply reducing the number of weights.
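A minimal sketch of that point, using ridge regression (the
linear-model analogue of weight decay) on a made-up simulated data
set; the penalty strength lam = 5.0 is an arbitrary choice for
illustration, not a recommendation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 30
X = rng.normal(size=(n, p))
beta = rng.normal(scale=0.3, size=p)       # many small true effects
y = X @ beta + rng.normal(size=n)

X_test = rng.normal(size=(200, p))
y_test = X_test @ beta + rng.normal(size=200)

# Ordinary least squares: all p parameters, no shrinkage
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge regression: penalize the squared norm of the weights
# (weight decay) instead of deleting parameters outright
lam = 5.0
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Regularization shrinks the weights toward zero rather than
# eliminating them
assert np.linalg.norm(b_ridge) < np.linalg.norm(b_ols)

mse = lambda b: np.mean((y_test - X_test @ b) ** 2)
print(f"OLS test MSE:   {mse(b_ols):.3f}")
print(f"Ridge test MSE: {mse(b_ridge):.3f}")
```

When there are many small true effects, as here, shrinking all the
weights a little typically predicts better than keeping a few weights
and zeroing the rest.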

|>   3.  Statistics are better understood.  We don't have all the neural
|>       net theory worked out yet.  We use neural nets more informally.

There is a considerable amount of statistical theory that is directly
applicable to neural nets. See the articles and books I have cited
earlier in this thread.


-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
