Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!portc01.blue.aol.com!newsxfer3.itd.umich.edu!su-news-hub1.bbnplanet.com!cpk-news-hub1.bbnplanet.com!news.bbnplanet.com!cam-news-hub1.bbnplanet.com!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: Use of the Softmax function
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <E4MyFC.6wF@unx.sas.com>
Date: Sun, 26 Jan 1997 21:59:36 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <5c9a23$v4@mozo.cc.purdue.edu>
Organization: SAS Institute Inc.
Lines: 24


In article <5c9a23$v4@mozo.cc.purdue.edu>, schatter@gilbreth.ecn.purdue.edu (Subrata Chatterjee) writes:
|> Can I use the softmax function if the sum of all the outputs of my network (trained with BP)
|> does not add up to 1?

No. Softmax forces the outputs to sum to 1, so it is appropriate only
when the classes are mutually exclusive.

|> For example, I have a network which has 10 output nodes.
|> Each of these output nodes represents a separate event (the events are not
|> mutually exclusive) to which an input vector can be mapped. Since I am interpreting the
|> output of each node to be the probability that the input vector belongs to the class
|> represented by that node, the range of each output node is between 0 and 1.
|> However, since an input vector may belong to several classes at once, the sum of
|> the output nodes will not add up to 1.

Just use the usual logistic sigmoid function on each output node.
Each output is then an independent probability in (0,1), and the sum
of the outputs is unconstrained, which is what you want when the
classes are not mutually exclusive.
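A short sketch (my illustration, not part of the original post) of the
difference: softmax normalizes the outputs so they sum to 1, whereas a
separate logistic sigmoid per node gives independent probabilities whose
sum can be anything between 0 and the number of nodes.

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sigmoid(x):
    # The usual logistic sigmoid, applied to one output node at a time.
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical raw outputs of three output nodes:
net_outputs = [2.0, -1.0, 0.5]

p_softmax = softmax(net_outputs)                # sums to 1: mutually exclusive classes
p_sigmoid = [sigmoid(x) for x in net_outputs]   # each in (0,1); sum unconstrained

print(sum(p_softmax))   # ~1.0
print(p_sigmoid)        # independent class membership probabilities
```

With the sigmoid version, an input can plausibly belong to several
classes at once (several outputs near 1), which softmax cannot express.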

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***

