Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornell!travelers.mail.cornell.edu!news.kei.com!hookup!news.mathworks.com!uunet!psinntp!nntp.hk.super.net!news.ust.hk!shunkong
From: shunkong@cs.ust.hk (*-- Michael --*)
Subject: Re: Why Sigmoid functions ??????
Message-ID: <1995Feb22.182007.26587@uxmail.ust.hk>
Sender: usenet@uxmail.ust.hk (usenet account)
Nntp-Posting-Host: cssu135.cs.ust.hk
Organization: The Hong Kong University of Science and Technology
References: <D4A17L.Cy1@hkuxb.hku.hk>
Date: Wed, 22 Feb 1995 18:20:07 GMT
Lines: 42

In article <D4A17L.Cy1@hkuxb.hku.hk>, IP Yu Ting <h9218252@hkuxa.hku.hk> wrote:
>
>Hi there,
>
>The sigmoid function is a popular choice of neuron transfer functions
>in feed-forward neural networks.  Is it because of its relationship
>with the Boltzmann distribution equation ?
>
>If it is, then what is so special about that relationship that
>makes sigmoid good for NN work ?

	One appealing feature of this curve is that both the function and
its derivative are continuous, which matters for gradient-based training
such as backpropagation. It works fairly well in practice and is often
the transfer function of choice. We use it to introduce nonlinearities
into the network; without a nonlinearity, a multi-layer network would
collapse to a single linear map.
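As a small illustration of the smoothness point above, here is a sketch of the sigmoid and its derivative; a handy property (which the post does not mention explicitly, but is standard) is that the derivative can be written in terms of the sigmoid itself:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: smooth and bounded in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """Derivative of the sigmoid: s(x) * (1 - s(x)).
    Continuous everywhere, with its peak at x = 0."""
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25
```

This s(1 - s) form is one practical reason the sigmoid is convenient: the derivative needed for backpropagation comes almost for free once the forward activation is computed.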

>
>What is the effect of the sigmoid constant (i.e. the temperature term) ?
>For sure we know that as T->0, it becomes a step function.  I mean, then
>what is the effect on the performance of NN of such a change ?

	It is a function that clips its input toward a minimum or maximum
in a nonlinear fashion, and we don't want a "hard limiter"; the
temperature term controls the sharpness of that clipping.
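A quick sketch of the temperature effect described above (the parameterization 1 / (1 + exp(-x/T)) is assumed, matching the T->0 step-function limit mentioned in the question):

```python
import math

def sigmoid_T(x, T):
    """Sigmoid with temperature T: as T -> 0 this approaches
    a hard step function at x = 0."""
    return 1.0 / (1.0 + math.exp(-x / T))

# Smaller T gives a steeper, more step-like curve at the same input.
for T in (2.0, 1.0, 0.1):
    print(T, sigmoid_T(1.0, T))
```

With T = 2 the output at x = 1 is only mildly above 0.5, while with T = 0.1 it is nearly saturated at 1, which is the "hard limiter" behavior the post warns about: near-zero gradients almost everywhere make learning difficult.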


Michael.

-- 
                ^  ^
*************   . .   **************************
               ( @  )
                 u
                                    Michael
Shun-Kong, Michael WAI             /______/\Wai
email   : shunkong@cs.ust.hk       \______\/__/\
tel. no.: (852)-2358-8833           No big \__\/
                                           deal!

***********************************************
