Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.sprintlink.net!crash!mkppp.cts.com!user
From: Dean_Abbott@partech.com (dean abbott)
Subject: Re: AIM Abduction package
Organization: pgsc
Date: Fri, 20 Jan 1995 01:55:40 GMT
Message-ID: <Dean_Abbott-1901951802020001@mkppp.cts.com>
References: <3fh12e$au9$2@mhade.production.compuserve.com>
Sender: news@crash.cts.com (news subsystem)
Nntp-Posting-Host: mkppp.cts.com
Lines: 31

In article <3fh12e$au9$2@mhade.production.compuserve.com>, Hani
<100021.1236@CompuServe.COM> wrote:

> Has anybody used the AIM package of AbTech. Based on polynomial 
> networks it is astonishingly fast in certain tasks. Has there 
> been comparison with the vanilla backprop ??

Yes, there have been many comparisons at conferences and in reviews.  I
know of reviews in the March 11, 1991 PCWeek, Feb. 1992 IEEE Spectrum, and
Sept. 1994 Stocks & Commodities.  In general, for problems with on the
order of a few dozen or fewer features, polynomial neural nets are much
better estimators than backprop nets (I've seen it over and over again in
my comparisons and in others').  If you have hundreds or thousands of
inputs, the combinatorial explosion will kill you (but then again, how
many problems have hundreds of input degrees of freedom INHERENT to the
problem?  I know of many that have that many time series points, but there
is usually high correlation between the points; that is another topic for
discussion :) ).
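To put a rough number on that combinatorial explosion: the count of
candidate monomial terms up to a given degree is a binomial coefficient,
so it blows up quickly as inputs are added.  This little sketch is just
illustrative (the function name and the degree-3 choice are my own; real
GMDH-style polynomial networks prune candidate terms rather than
enumerating them all):

```python
import math

def num_poly_terms(n_inputs, degree):
    # Number of monomials of total degree <= degree in n_inputs variables
    # is C(n_inputs + degree, degree).  Illustrative only; actual
    # polynomial-network tools grow/prune terms adaptively.
    return math.comb(n_inputs + degree, degree)

print(num_poly_terms(12, 3))   # a few dozen inputs: 455 candidate terms
print(num_poly_terms(200, 3))  # hundreds of inputs: 1,373,701 terms
```

Even at degree 3, going from a dozen inputs to two hundred multiplies the
candidate term count by over three thousand.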

For classification, it is not as good because it uses a squared-error
criterion and estimates the value of the output rather than finding the
decision boundary between classes.  There is another polynomial neural
network classification tool that does quite well (it uses a logistic-loss
error metric), from Barron Associates, Inc. in Charlottesville, VA.
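The difference between the two criteria is easy to see on a single
misclassified point.  This is a generic sketch of the two loss functions,
not anything from the AbTech or Barron Associates products:

```python
import math

def squared_error(y, p):
    # Squared-error criterion: treats the 0/1 class label as a value
    # to be fit; the penalty is bounded even when badly wrong.
    return (y - p) ** 2

def logistic_loss(y, p):
    # Logistic (log) loss: penalizes confident misclassification
    # without bound, which pushes the fit toward a good decision
    # boundary rather than a good value estimate.
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# True label 1, confidently wrong prediction p = 0.01:
print(squared_error(1, 0.01))   # 0.9801 -- bounded below 1
print(logistic_loss(1, 0.01))   # ~4.605 -- grows without bound as p -> 0
```

Under squared error, a confidently wrong answer costs at most about 1;
under logistic loss, the cost keeps climbing, so the training pressure
concentrates on fixing exactly the points near (or on the wrong side of)
the boundary.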

Hope this helps.

-- 
PAR Government Systems Corp.     |
1010 Prospect St., Suite 200     |
La Jolla, CA 92037               |
