Newsgroups: comp.ai.fuzzy
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.cis.ohio-state.edu!math.ohio-state.edu!howland.erols.net!newsxfer2.itd.umich.edu!uunet!in3.uu.net!news.nevada.edu!news.sprintlink.net!news-ana-7.sprintlink.net!news.sprintlink.net!news-ana-24.sprintlink.net!interpath!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Max Min Functions
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <Dw713F.5A8@unx.sas.com>
Date: Thu, 15 Aug 1996 19:06:51 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <4tijf6$fk1@whitbeck.ncl.ac.uk> <320E79A9.78D8@cs.nthu.edu.tw>
Organization: SAS Institute Inc.
Lines: 17


In article <320E79A9.78D8@cs.nthu.edu.tw>, Jyh-Shing Roger Jang <jang@cs.nthu.edu.tw> writes:
|> You can still use gradient descent on fuzzy systems with MIN/MAX operators.
|> It's the same as using gradient descent on y = max(x, -x). The only problem
|> that could happen is when you hit x=0, at which point the derivative is not
|> defined. But in practice, the chance of hitting EXACTLY a point with an
|> undefined derivative is almost zero.

Obviously you haven't done much testing. Ordinary gradient descent
algorithms do NOT work reliably on functions with large discontinuities
in the derivatives. There is considerable literature on this topic in
numerical analysis, usually referred to as "nonsmooth optimization".
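
To see the failure mode concretely, here is a minimal sketch (my own
illustration, not from the original thread): fixed-step gradient descent
applied to the nonsmooth f(x) = |x| = max(x, -x) versus the smooth
f(x) = x**2. The iterate never lands exactly on the kink at x = 0, yet
it still fails to converge -- it ends up bouncing back and forth across
the kink at the scale of the step size.

```python
# Gradient descent with a fixed step on a nonsmooth vs. a smooth function.
# The step size, starting point, and step count below are arbitrary choices
# for illustration.

def grad_abs(x):
    # Subgradient of f(x) = |x|; pick +1 at x = 0 (we never hit 0 exactly).
    return 1.0 if x >= 0 else -1.0

def grad_square(x):
    # Gradient of f(x) = x**2.
    return 2.0 * x

def descend(grad, x0, lr=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

x_nonsmooth = descend(grad_abs, x0=1.05)
x_smooth = descend(grad_square, x0=1.05)
# On |x| the iterate oscillates between roughly +lr/2 and -lr/2 forever;
# on x**2 the same method shrinks toward 0 geometrically.
```

The point is that the derivative is defined almost everywhere, exactly as
the quoted post says, and the method still stalls: convergence needs a
step-size schedule or a genuinely nonsmooth method (bundle or subgradient
schemes), which is what the nonsmooth-optimization literature is about.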
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
