Newsgroups: comp.ai.neural-nets,comp.ai
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!godot.cc.duq.edu!news.duke.edu!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!usc!elroy.jpl.nasa.gov!decwrl!netcomsv!ix.netcom.com!netcom.com!olea
From: olea@netcom.com (Michael Olea)
Subject: Re: Explanation & ANNs
Message-ID: <oleaCx0KwL.MC7@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <369o2m$qtn@post.its.mcw.edu> <1994Oct1.184128.4271@news.vanderbilt.edu>
Date: Sat, 1 Oct 1994 21:55:33 GMT
Lines: 8
Xref: glinda.oz.cs.cmu.edu comp.ai.neural-nets:19211 comp.ai:24479

>One then uses decision tree induction to build a DT classifier
>that classifies as level N [of a neural net] does...

	I find this a little strange, since the DT's thresholds partition
the input feature space into equivalence classes that are axis-aligned
hyper-rectangles, whereas layer N of a neural net partitions feature space
into more general regions: each unit's threshold defines a hyperplane in
an arbitrary orientation, and combinations of units can carve out general
polytopes (or curved regions, with nonlinear activations).
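	A toy illustration of the mismatch (data made up for the example,
not from the original posting): a single linear-threshold unit with weights
(1, 1) separates four points by the diagonal x + y > 0, but no single
axis-aligned threshold -- i.e. a depth-1 decision tree -- classifies the
same four points perfectly:

```python
def perceptron(p):
    # one linear-threshold unit, weights (1, 1), bias 0:
    # the decision boundary is the diagonal line x + y = 0
    return p[0] + p[1] > 0

points = [(2, -1), (-1, 2), (-2, 1), (1, -2)]
labels = [perceptron(p) for p in points]    # [True, True, False, False]

def stump_accuracy(axis, threshold):
    # depth-1 DT: predict True iff the chosen coordinate exceeds threshold
    preds = [p[axis] > threshold for p in points]
    hits = sum(a == b for a, b in zip(preds, labels))
    # the stump may also use the opposite orientation; take the better one
    return max(hits, len(points) - hits)

best = max(stump_accuracy(axis, t)
           for axis in (0, 1)
           for t in (-3, -1.5, -0.5, 0, 0.5, 1.5, 3))
print(best, "of", len(points))    # prints: 3 of 4
```

Deeper trees can of course approximate the diagonal with a staircase of
axis-aligned splits, which is presumably why the DT extracted from the net
only mimics it rather than matching it exactly.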

