Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!cam-news-feed3.bbnplanet.com!news.bbnplanet.com!cam-news-hub1.bbnplanet.com!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Why more than one hidden layer ?
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <E4J62K.Bn3@unx.sas.com>
Date: Fri, 24 Jan 1997 20:54:20 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <5bhqge$evf$1@mark.ucdavis.edu> <5big06$5s2@news.ox.ac.uk> <5bilev$hts@camel2.mindspring.com> <5biq07$b9o@news.ox.ac.uk> <01bc07df$078290f0$cb9901be@IS3203>
Organization: SAS Institute Inc.
Lines: 30


In article <01bc07df$078290f0$cb9901be@IS3203>, "Mark Walker" <mwalker@aisvt.bfg.com> writes:
|> 
|> 
|> Patrick Juola wrote:
|> > -> SNIP <-
|> > It's been widely accepted since 1986 that a single layer suffices;
|> > ->SNIP<-
|> 
|> Suffices for what?

Certain theoretical results (universal approximation theorems) of
limited practical use.

|> Isn't it true that for a specific problem, a
|> multi-hidden layered network might converge to a lower error than for a
|> single-hidden layered network?  And perhaps with less total net parameters
|> (i.e. smaller layers).

Yes, that is true. There is a rather detailed discussion of this issue,
including examples with plots of output surfaces, under "How many hidden
layers should I use?" in the Neural Network FAQ, part 3 of 7:
Generalization at ftp://ftp.sas.com/pub/neural/FAQ3.html
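As a back-of-the-envelope illustration of the "fewer total net
parameters" point (the layer sizes below are arbitrary examples of
mine, and counting parameters by itself says nothing about which net
actually trains to lower error -- see the FAQ for examples with output
surfaces), here is how parameter counts compare for a wide shallow net
versus a narrower two-hidden-layer net:

```python
# Hypothetical sketch: total parameter counts of fully connected
# feedforward nets.  A layer of size b fed by a layer of size a
# contributes a*b weights plus b biases, i.e. (a + 1) * b parameters.

def n_params(layer_sizes):
    """Total weights + biases for a fully connected feedforward net."""
    return sum((a + 1) * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# One wide hidden layer: 1 input -> 30 hidden -> 1 output
wide = n_params([1, 30, 1])    # (1+1)*30 + (30+1)*1 = 91

# Two narrow hidden layers: 1 input -> 5 -> 5 -> 1 output
deep = n_params([1, 5, 5, 1])  # (1+1)*5 + (5+1)*5 + (5+1)*1 = 46

print(wide, deep)  # the deeper net here has roughly half the parameters
```

Whether the smaller, deeper net also converges to lower error on a
given problem is an empirical question, which is the point of the FAQ
discussion above.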

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

