Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Some thoughts about NNs...
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DCHppn.Ct5@unx.sas.com>
Date: Sat, 29 Jul 1995 18:22:35 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <Pine.SOL.3.91.950727135450.6015B-100000@altis> <3vdmb5$h00@newsbf02.news.aol.com>
Organization: SAS Institute Inc.
Lines: 30


In article <3vdmb5$h00@newsbf02.news.aol.com>, johplummer@aol.com (JohPlummer) writes:
|> Tomasz Sarnatowicz writes:
|>
|> "In general: number of dimensions of data set has no meaning. If its big,
|> only the layer size has to be big enough, number of layers does not
|> change. On the other hand: instead of building big layers we can add one
|> or two and get better results. Three layers are enough for anything, but
|> 'enough' not always means 'good'."
|>
|> Mark Kramer, in his paper "Nonlinear Principal Component Analysis Using
|> Autoassociative Neural Networks", provides what seems to me a convincing
|> argument that 3-layer autoassociative memories are not nearly as
|> efficient as a properly constructed 5-layer net at achieving nonlinear PCA.
|>
|> Do you have any references that show that Mark got this one wrong?

Please recall the ambiguity in counting layers: a net with one hidden
layer is counted as a two-layer net by some people and as a three-layer
net by others.

If a "5-layer" net means three hidden layers, then I agree--three
hidden layers are necessary and sufficient to get a useful nonlinear
generalization of PCA.
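
To make that concrete, here is a rough sketch of a Kramer-style
autoassociative net in Python (using the PyTorch library; the layer
sizes and training settings are made-up illustrations, not anything
taken from Kramer's paper). Counting the input and output layers, this
is the "5-layer" net: a nonlinear mapping layer, a bottleneck, and a
nonlinear demapping layer.

  import torch
  import torch.nn as nn

  n_inputs  = 10   # dimension of the data (made up)
  n_hidden  = 20   # mapping/demapping layer size (made up)
  n_factors = 2    # bottleneck size = number of nonlinear components

  # Input -> mapping -> bottleneck -> demapping -> output.
  # Counting input and output, five layers; three of them hidden.
  net = nn.Sequential(
      nn.Linear(n_inputs, n_hidden), nn.Tanh(),   # mapping (nonlinear)
      nn.Linear(n_hidden, n_factors),             # bottleneck (linear)
      nn.Linear(n_factors, n_hidden), nn.Tanh(),  # demapping (nonlinear)
      nn.Linear(n_hidden, n_inputs),              # output (linear)
  )

  # Train the net to reproduce its inputs at its outputs.
  x = torch.randn(500, n_inputs)                  # stand-in data
  opt = torch.optim.Adam(net.parameters(), lr=1e-3)
  mse = nn.MSELoss()
  for _ in range(2000):
      opt.zero_grad()
      loss = mse(net(x), x)
      loss.backward()
      opt.step()

  # The bottleneck activations are the nonlinear "principal components".
  with torch.no_grad():
      scores = net[:3](x)   # mapping + tanh + bottleneck

Strip out the mapping and demapping layers and the bottleneck can
represent little more than a linear projection, which is the heart of
the argument for the deeper net.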

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
