Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!pipex!uunet!zib-berlin.de!fauern!rrze.uni-erlangen.de!hub-n.franken.de!ark.franken.de!ralf
From: ralf@ark.franken.de (Ralf W. Stephan)
Subject: ALN question: some result
Message-ID: <1994Oct8.184114.809@ark.franken.de>
Organization: his desk writing an article
Date: Sat, 8 Oct 1994 18:41:14 GMT
X-Newsreader: TIN [version 1.2 PL2]
Lines: 40

  Hi,

earlier I posed the question whether ALN training is faster than
the usual backprop training, even when continuous input is
involved.  I did some testing and can give at least an impression
of what the answer might look like.

The task:  the 'sonar' problem (one of the CMU benchmarks).
Machine:   a 486DX2/66 running Linux.
BP soft:   SNNSv3.2 (Rprop, learning param 0.2)
ALN soft:  atree-2.0 (wrapped in a C program that quantizes the input)
(both compiled with gcc 2.5.8 -O2)

BP net:    60 inputs, 16 hidden, 2 outputs
ALN net:   180 inputs (the 60 values, thermometer-coded with 3 bits each),
           2047 leaves, 1 tree
(both trained until 0% error on the training data, i.e. 200 epochs
 for the BP net, and between 5 and 100 ALN epochs, 15 on average)

BP training time:  about 50 seconds
ALN training time: between 5 and 30 seconds, av. 10 seconds

Errors on the test set:
BP: between 15 and 20 %
ALN: between 15 and 30 %   
(I ran the ALN more often because of its speed and because I could
 not automate the SNNS BP procedure)

Sorry for the inexactness, but I didn't want to write a paper  8-)

It would be nice if we could get more numbers on this problem, esp.
since I am no expert and have surely missed some advances.
I would be very interested in how atree-3.0 performs on this problem.

Anyway, the story is that I'll now try to use ALNs for my speech
recognition project.

Thanks for your time.
ralf
--
You are in a different maze of little articles, all alike.
