Newsgroups: comp.ai,comp.ai.edu,comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!news.alpha.net!uwm.edu!fnnews.fnal.gov!gw2.att.com!nntpa!ssbunews!not-for-mail
From: flb@odutsa.nw.att.com (blackmond)
Subject: perceptrons
Message-ID: <D6IM6C.48s@ssbunews.ih.att.com>
Sender: news@ssbunews.ih.att.com (Netnews Administration)
Nntp-Posting-Host: odutsa.nw.att.com
Organization: AT&T
Distribution: usa
Date: Tue, 4 Apr 1995 14:42:12 GMT
Lines: 11
Xref: glinda.oz.cs.cmu.edu comp.ai:28782 comp.ai.edu:2429 comp.ai.neural-nets:23247

If anyone out there can help me answer this question, I would appreciate it.

If the activation function of all hidden units is linear, show that
a multilayer perceptron is equivalent to a single-layer perceptron.
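Not a complete answer, but the usual argument is that a composition of linear maps is itself a linear map: with linear hidden units (and ignoring biases), y = W2 (W1 x) = (W2 W1) x, so the stacked weight matrices collapse into a single matrix W = W2 W1 and the network computes the same function as a single-layer perceptron with weights W. A minimal pure-Python sketch of that identity (the matrices W1, W2 and the input x below are made-up examples, not from any particular network):

```python
# Sketch: two stacked linear layers compute the same function as the
# single layer whose weight matrix is the product of the two.

def matvec(M, v):
    # matrix-vector product: (M v)_i = sum_j M[i][j] * v[j]
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def matmul(A, B):
    # matrix-matrix product: (A B)[i][j] = sum_k A[i][k] * B[k][j]
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

W1 = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # input -> hidden (3x2), made up
W2 = [[1.0, 0.0, 2.0], [0.0, 1.0, 1.0]]    # hidden -> output (2x3), made up
x  = [1.0, -1.0]                           # example input

two_layer = matvec(W2, matvec(W1, x))  # y = W2 (W1 x): the "MLP"
W = matmul(W2, W1)                     # collapsed single-layer weights
one_layer = matvec(W, x)               # y = (W2 W1) x

print(two_layer == one_layer)  # True: identical outputs for this x
```

The same cancellation works for any number of linear layers, since the product of any chain of weight matrices is again one matrix; biases fold in the same way (b = W2 b1 + b2 for two layers).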


Thanks in advance for your help.
Francine Blackmond
(708) 224-2381

