Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!europa.eng.gtefsd.com!howland.reston.ans.net!agate!hpg30a.csc.cuhk.hk!hkuxb.hku.hk!h9218252
From: h9218252@hkuxb.hku.hk (IP Yu Ting)
Subject: Re: another reason for sigmoid functions....
Message-ID: <D4FzsF.GrM@hkuxb.hku.hk>
Sender: usenet@hkuxb.hku.hk (USENET News System)
Nntp-Posting-Host: hkuxb.hku.hk
Organization: The University of Hong Kong
X-Newsreader: TIN [version 1.1 PL6]
References: <Mark_Vogt.17.001168E9@qmgate.anl.gov>
Date: Thu, 23 Feb 1995 07:36:15 GMT
Lines: 16

Mark Edward Vogt (Mark_Vogt@qmgate.anl.gov) wrote:
: Don't forget that the sigmoid function, by virtue of its exponential 
: characteristics, is infinitely differentiable.......
: 
: Mark out.

For BP, only the first derivative of the transfer function needs to
exist.  Other than that, what is so special about infinite
differentiability for NN processing or learning ?
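
To make the point concrete, here is a minimal sketch (my own
illustration, not from Mark's post) of the only place BP ever touches a
derivative: the first derivative of the transfer function in the delta
term.  The function and variable names are hypothetical.

```python
import math

def sigmoid(x):
    # logistic sigmoid s(x) = 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # BP only needs this first derivative; for the sigmoid it has
    # the convenient closed form s'(x) = s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def update_weight(w, inp, target, lr=0.5):
    # one BP update for a single sigmoid unit:
    # delta = (target - out) * f'(net);  w <- w + lr * delta * inp
    net = w * inp
    out = sigmoid(net)
    delta = (target - out) * sigmoid_prime(net)
    return w + lr * delta * inp
```

Note that nothing in the update rule asks for the second or higher
derivatives, which is why the question arises at all.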

(Please don't be offended if this question appears to be foolish :-).)
Thanx.

IP

