Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!EU.net!Austria.EU.net!siemens.co.at!yurt
From: yurt@cent.gud.siemens.co.at (Yurtsever Kutay PSE)
Subject: Jumps / discontinuities at the output
Sender: news@siemens.co.at (Newssoftware)
Message-ID: <1994Sep16.133710.15523@siemens.co.at>
Date: Fri, 16 Sep 1994 13:37:10 GMT
Nntp-Posting-Host: cent.gud.siemens-austria
Organization: SIEMENS AG AUSTRIA
X-Newsreader: TIN [version 1.2 PL0]
Lines: 14

How do jumps or discontinuities at the output for small input changes
affect the generalization capability of an MLP? My guess is that small
input changes should produce small output changes for good generalization.
But what if the target function has jumps at the output -- what should one do then?
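To make the question concrete, here is a minimal numpy sketch of the situation: a tiny one-hidden-layer MLP (tanh hidden units, linear output, plain batch gradient descent on MSE) fit to a unit-step target. All the specifics (network size, learning rate, step count, the step target itself) are arbitrary illustrative choices, not anyone's recommended setup. Since the network is a composition of continuous functions, its output is continuous in the input, so it can only smooth over the jump, fitting well away from it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discontinuous target: a unit step at x = 0 (toy example).
x = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
y = (x >= 0.0).astype(float)

# Tiny 1-hidden-layer MLP: tanh hidden units, linear output.
H = 8
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(10000):
    h = np.tanh(x @ W1 + b1)          # hidden activations
    out = h @ W2 + b2                 # network output: continuous in x
    err = out - y
    # Backprop for batch MSE loss
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)    # tanh' = 1 - tanh^2
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def predict(xv):
    h = np.tanh(np.atleast_2d(xv) @ W1 + b1)
    return (h @ W2 + b2).ravel()

# Away from the jump the fit is close to the target levels; at x = 0
# the network necessarily interpolates smoothly between 0 and 1.
print(predict([-1.0]), predict([1.0]), predict([0.0]))
```

The smooth transition region around x = 0 is exactly where small input changes produce large output changes, which is presumably where generalization near the discontinuity suffers.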

Thanks in advance for suggestions.

--
========================================================================
= Kutay Yurtsever          =        email: yurt@cent.gud.siemens.co.at =
= Siemens AG Austria,      =        Tel: +43 1 60171 6097              =
= Gudrunstr. 11,           =        Fax: +43 1 60171 6399              =
= A-1100 Vienna            =                                           =
========================================================================
