Newsgroups: comp.ai.jair.announce
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!oitnews.harvard.edu!news.sesqui.net!dildog.lgc.com!lgc.com!cs.utexas.edu!swrinde!elroy.jpl.nasa.gov!ames!kronos.arc.nasa.gov!jair-ed
From: jair-ed@ptolemy.arc.nasa.gov
Subject: Comments 
Message-ID: <1995Jun13.212904.23785@ptolemy-ethernet.arc.nasa.gov>
Originator: jair-ed@polya.arc.nasa.gov
Lines: 132
Sender: usenet@ptolemy-ethernet.arc.nasa.gov (usenet@ptolemy.arc.nasa.gov)
Nntp-Posting-Host: polya.arc.nasa.gov
Organization: NASA/ARC Computational Sciences Division
Date: Tue, 13 Jun 1995 21:29:04 GMT
Approved: jair-ed@ptolemy.arc.nasa.gov

As previously announced, JAIR now has an experimental WWW form system
that enables readers to read and write comments on articles. According
to our log, people are reading the comments. On the other hand, it
seems that our readers are a bit shy about posting. I'd like to
encourage you to get the ball rolling by posting some comments. This will
help us evaluate the system.

For instance, I'm sure many of you were intrigued by the recent
article by Russell and Subramanian. Their article certainly addresses
an interesting and important topic that many of us have thought about.
I would like to hear what you think about Russell and Subramanian's
contribution to the field and the notion of "bounded optimality". In
particular, if you have something to add, or perhaps a technical
criticism, posting it would be quite helpful. In any event, please
make use of this facility!

Below I've listed the comments we've received to date.  To use the
comments facility, go to JAIR's home page and find the table of
contents for Volume 2.  You can read or write comments on the
individual articles listed in the table of contents simply by clicking
on the appropriate hot spots.

Steve Minton
Minton@ptolemy.arc.nasa.gov
JAIR Executive editor

---------------------------------------------------


REGARDING:
Donoho, S.K. and Rendell, L.A. (1995)
  "Rerepresenting and Restructuring Domain Theories:  A Constructive 
   Induction Approach", Volume 2, pages 411-446.

Comment:

I found the article to be extremely illuminating - particularly the
notion of incorporating structure into the representation language,
thereby creating a new and more easily searched representation
language. Here are some general comments, and I would appreciate any
reply from the author.

(i) The code economy and performance improvements in the TGCI system
need to be quantified better. For example, the economy aspects were
mentioned but never measured. I believe other researchers in the
inductive learning field have used measures such as Minimum
Description Length for this purpose. Whatever the metric, it would be
good to see some sort of comparison, since TGCI clearly does seem to
have advantages here.
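
To make the suggestion concrete, here is a toy two-part MDL score
(total cost = bits to encode the theory plus bits to encode its
exceptions); the encoding scheme and all the numbers are my own
illustrative assumptions, not taken from the paper:

```python
# Toy two-part Minimum Description Length score: a compact theory that
# makes a few errors can still beat a larger exact one once both the
# model bits and the exception bits are counted.
import math

def description_length(n_literals, n_errors, n_examples, n_features):
    # Model cost: each literal names one of n_features attributes.
    model_bits = n_literals * math.log2(n_features)
    # Data cost: say how many errors there are, then which examples they are.
    data_bits = math.log2(n_examples + 1) + (
        math.log2(math.comb(n_examples, n_errors)) if n_errors else 0)
    return model_bits + data_bits

big_exact = description_length(n_literals=40, n_errors=0,
                               n_examples=200, n_features=16)
small_approx = description_length(n_literals=12, n_errors=3,
                                  n_examples=200, n_features=16)
print(big_exact, small_approx)   # the smaller approximate theory wins
```

A restructured theory could then be compared against the original on
this single number.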

(ii) The actual feature construction algorithm needs to be explained
far better than it currently is. Although the author "steps through"
the algorithm, he does not explain what it is doing. An overview or
intuitive description of the algorithm would help substantially.

Apart from these points, I would say this is an excellent paper, and
one which has inspired me to study this area further.

Best wishes,
Brendan 

Brendan Kitts 
bj@acs.bu.edu 
Wed May 31 11:08:57 1995
 
------------------------------------------------------------------------
REGARDING:

Turney, P.D. (1995)
  "Cost-Sensitive Classification: Empirical Evaluation of a Hybrid 
   Genetic Decision Tree Induction Algorithm", Volume 2, pages 369-409.

COMMENT:

After my paper was published, I found some more 
"Related Work". Haleh Vafaie and Kenneth De Jong
have published a series of papers on using
genetic algorithms for feature selection for
conventional inductive learning algorithms. They
combine GENESIS and AQ15. See:

<GA Group Publications List, http://www.cs.gmu.edu/research/gag/pubs.html>
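
The general recipe behind that line of work (a genetic algorithm
searching over feature bit masks, where the fitness of a mask is the
accuracy of a conventional learner restricted to the selected
features) can be sketched as below; the 1-nearest-neighbour learner
and the toy data are my own stand-ins, not what Vafaie and De Jong
actually used:

```python
# Sketch of GA-based feature selection: evolve bit masks over features,
# scoring each mask by the accuracy of a simple learner (here 1-NN)
# that sees only the selected features.
import random
random.seed(0)

# Toy data: features 0 and 1 are informative, features 2-5 are noise.
def make_point(label):
    x = [label + random.gauss(0, 0.3), -label + random.gauss(0, 0.3)]
    x += [random.gauss(0, 1) for _ in range(4)]
    return x, label

train = [make_point(l) for l in (0, 1) for _ in range(20)]
evalset = [make_point(l) for l in (0, 1) for _ in range(20)]

def accuracy(mask):
    """Fitness: 1-NN accuracy using only features where mask[i] == 1."""
    idx = [i for i, bit in enumerate(mask) if bit]
    if not idx:
        return 0.0
    def dist(a, b):
        return sum((a[i] - b[i]) ** 2 for i in idx)
    correct = 0
    for x, y in evalset:
        _, pred = min((dist(x, tx), ty) for tx, ty in train)
        correct += (pred == y)
    return correct / len(evalset)

def evolve(n_features=6, pop_size=20, generations=15):
    pop = [[random.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=accuracy, reverse=True)
        parents = scored[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_features)       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=accuracy)

best = evolve()
print(best, accuracy(best))
```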

Also, Juergen Branke has written a survey of
neural networks and evolutionary computation
(another hybrid of genetic algorithms and
inductive learning). See:


<Juergen Branke's Homepage, http://www.cs.gmu.edu/research/gag/pubs.html>

- Peter.

Peter Turney 
peter@ai.iit.nrc.ca 
Tue Apr 11 13:28:26 1995
 

------------------------------------------------------------------------

REGARDING:
Dietterich, T.G. and Bakiri, G. (1995)
  "Solving Multiclass Learning Problems via Error-Correcting Output Codes",
   Volume 2, pages 263-286.

COMMENT:

I've found this work to be quite influential in my
work on multiclass tasks.  Rich Bankert and I have
investigated cloud classification tasks in which
errors were concentrated in only a few pairs of
classes.  I investigated whether ECOCs designed
to maximize the distance in output encoding space
between such pairs of classes would increase overall
classification accuracy.  I learned that Nick Flann
had already tackled this issue.  Like Nick, I was
unable to get this trick to work.  He refers to this
approach as min_confusion, and compared it to
randomly-generated and max_confusion designs.  I
actually found max_confusion to perform best in the
studies I ran.  Nick (flann@nick.cs.usu.edu) has a
good theory for explaining these findings (i.e., it
concerns the relative difficulty of learning specific
output bits) and has written a preliminary report.
This motivated Nick to pursue some interesting research
directions.  I remain curious as to whether a wrapper
model approach might yield better performance on my
tasks.
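
The decoding idea under discussion (assign the class whose codeword
is nearest in Hamming distance, so that well-separated codewords can
absorb individual bit errors) can be sketched in a few lines; the
tiny code matrix and function names here are illustrative only, not
taken from Dietterich and Bakiri's paper:

```python
# Toy error-correcting output code (ECOC) decoder. Each row is a
# class's codeword; each column defines one binary subproblem that a
# separate binary learner would be trained on.
import numpy as np

CODE = np.array([
    [0, 0, 1, 1, 0, 1],   # class 0
    [1, 0, 0, 1, 1, 0],   # class 1
    [0, 1, 0, 0, 1, 1],   # class 2
    [1, 1, 1, 0, 0, 0],   # class 3
])

def hamming(a, b):
    return int(np.sum(a != b))

def decode(bit_predictions):
    """Assign the class whose codeword is nearest in Hamming distance."""
    return int(np.argmin([hamming(bit_predictions, row) for row in CODE]))

def min_pairwise_distance(code):
    """Minimum Hamming distance between any two codewords; a code with
    minimum distance d corrects floor((d-1)/2) single-bit errors, so
    random, min_confusion, and max_confusion designs can be compared
    on how they spread this distance across class pairs."""
    n = len(code)
    return min(hamming(code[i], code[j])
               for i in range(n) for j in range(i + 1, n))

# With one binary learner wrong, decoding still recovers class 2:
noisy = np.array([0, 1, 0, 0, 1, 0])   # class 2's codeword, last bit flipped
print(decode(noisy))                   # -> 2
print(min_pairwise_distance(CODE))     # -> 4
```

A min_confusion design would, in this framing, choose the matrix so
that the rows of frequently confused class pairs are especially far
apart, rather than maximizing the minimum distance uniformly.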

David Aha
aha@AIC.NRL.Navy.Mil 
Thu Mar 30 11:45:41 1995
 
