Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!gatech!concert!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Alopex (Was Re: which software package can train nn with step function?)
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <CzMt11.13H@unx.sas.com>
Date: Mon, 21 Nov 1994 19:05:25 GMT
References:  <3agset$qu6@joyce.iol.ie>
Nntp-Posting-Host: hotellng.unx.sas.com
Organization: SAS Institute Inc.
Lines: 27


In article <3agset$qu6@joyce.iol.ie>, hourihaj@iol.ie (John D Hourihane) writes:
|> davidz@neuro.ece.wisc.edu (David Zhang) wrote:
|>
|> >... train a three-layered NN with the hardlimit (step
|> >function) activation  function?...
|>
|> (2) Alopex -- I'm not sure where to get a paper on this one,
|> but it allows you to train networks (any connectivity, including
|> recurrent nets) with any activation functions (not just
|> differentiable ones) according to any error metric. Extremely
|> powerful, but so hugely under-acknowledged that I can't even
|> give you a reference for it.

Unnikrishnan & Venugopal (1994) "Alopex: A correlation based learning
algorithm for feedforward and recurrent neural networks," Neural
Computation, 6, 469-490.

The published version is much better than the one in neuroprose.

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
