Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!satisfied.elf.com!news.mathworks.com!uunet!psinntp!nntp.hk.super.net!news.ust.hk!shunkong
From: shunkong@cs.ust.hk (*-- Michael --*)
Subject: Re: No. of hidden neurons ...
Message-ID: <1995Feb17.080704.26746@uxmail.ust.hk>
Sender: usenet@uxmail.ust.hk (usenet account)
Nntp-Posting-Host: cssu135.cs.ust.hk
Organization: The Hong Kong University of Science and Technology
References: <D44LrC.IM2@hkuxb.hku.hk>
Date: Fri, 17 Feb 1995 08:07:04 GMT
Lines: 42

In article <D44LrC.IM2@hkuxb.hku.hk>, IP Yu Ting <h9218252@hkuxa.hku.hk> wrote:
>
>Hi there,
>
>Is it true that any multilayer perceptron with ONE hidden layer can
>realize ANY function ?  Or just binary functions ?

	In most cases, one hidden layer with an adequate number of hidden
nodes is enough to approximate any continuous function arbitrarily well,
not just binary functions. :)

>
>What about continuous valued functions ?  What is the minimum
>architecture that will enable mapping of p continuous valued patterns
>to a single continuous output ?

	I have read a paper about the number of hidden-layer nodes; it
suggests something like hidden = sqrt(input * output).  Applying this
formula, if I understand it correctly, the architecture should be:

	output: 1
	hidden: sqroot(p)
	input : p
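As a quick sketch of that rule of thumb (Python, function name and rounding choice are mine, not from the paper):

```python
import math

def hidden_units(n_input, n_output):
    """Rule-of-thumb hidden layer size: sqrt(input * output),
    rounded to the nearest whole node, at least 1."""
    return max(1, round(math.sqrt(n_input * n_output)))

# For p continuous-valued inputs mapped to a single output,
# e.g. p = 16:
print(hidden_units(16, 1))  # sqrt(16 * 1) = 4 hidden nodes
```

Of course this is only a heuristic for a starting point; the right size still depends on the data.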

Any comments ?


Michael.

-- 
                ^  ^
*************   . .   **************************
               ( @  )
                 u
                                    Michael
Shun-Kong, Michael WAI             /______/\Wai
email   : shunkong@cs.ust.hk       \______\/__/\
tel. no.: (852)-2358-8833           No big \__\/
                                           deal!

***********************************************


