Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: NN Vs Stats......
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D2nqML.7M9@unx.sas.com>
Date: Thu, 19 Jan 1995 14:51:57 GMT
References: <1995Jan11.145719.1@ulkyvx.louisville.edu> <3fi9ec$jus@maui.cs.ucla.edu> <3fk4rc$pud@nyx10.cs.du.edu>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 18


In article <3fk4rc$pud@nyx10.cs.du.edu>, abuslik@nyx10.cs.du.edu (Arthur Buslik) writes:
|> I would argue that statistics involves obtaining estimates of parameters
|> by sampling a population.  Given a statistical model representing a
|> population, one uses certain random variables (e.g., a sample mean) to
|> estimate population parameters (e.g., the population mean).  The ANN used
|> for function approximation does not have an underlying statistical model;
|> there are no random variables.  This appears to be one difference.

As soon as you use least squares for training, you have implicit
distributional assumptions, since least squares is maximum likelihood
for a normal distribution of noise.
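(The equivalence follows because the Gaussian log-likelihood is a constant
minus a positive multiple of the sum of squared errors, so maximizing one
minimizes the other. A minimal numerical sketch, not from the original post,
using the simplest case — a constant model y_i = mu + noise, where both
criteria are optimized at the sample mean; the data values and grid search
are illustrative choices:)

```python
import math

# Toy sample; its mean is 2.4.
data = [2.1, 1.9, 3.2, 2.6, 2.2]

def sse(mu):
    """Least-squares criterion: sum of squared residuals."""
    return sum((y - mu) ** 2 for y in data)

def gaussian_loglik(mu, sigma=1.0):
    """Log-likelihood under y_i ~ Normal(mu, sigma^2) noise."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (y - mu) ** 2 / (2 * sigma ** 2)
               for y in data)

# Crude grid search over candidate values of mu.
grid = [i / 1000 for i in range(1000, 4001)]
mu_ls = min(grid, key=sse)             # least-squares estimate
mu_ml = max(grid, key=gaussian_loglik)  # maximum-likelihood estimate

print(mu_ls, mu_ml)  # both land on the sample mean, 2.4
```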

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
