Newsgroups: comp.lang.prolog
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!portc01.blue.aol.com!portc02.blue.aol.com!howland.erols.net!news.bbnplanet.com!cam-news-hub1.bbnplanet.com!uunet!in3.uu.net!uucp3.uu.net!allegra!akalice!baldy.research.att.com!user
From: pereira@research.att.com (Fernando Pereira)
Subject: Re: Minimum value of predicate? (Efficiency)
X-Nntp-Posting-Host: baldy.research.att.com
Message-ID: <pereira-3101972113150001@baldy.research.att.com>
Sender: news@research.att.com (netnews <9149-80593> 0112740)
Organization: AT&T Research
References: <32DD1DC9.47E6@info.ucl.ac.be> <pereira-2001972020370001@baldy.research.att.com> <5cglet$9g6@mulga.cs.mu.OZ.AU> <1997Jan27.140001.9455@let.rug.nl>
Date: Sat, 1 Feb 1997 01:13:15 GMT
Lines: 34

In article <1997Jan27.140001.9455@let.rug.nl>, vannoord@let.rug.nl
(Gertjan van Noord) wrote:
> >pereira@research.att.com (Fernando Pereira) writes:
> >
> >But also "in practice", many algorithms that are worse than O(n log n) are
> >used on data that does not exercise their worst-case behavior. It would of
> >course be nice to be able to prove that the actual data fall in restricted
> >classes for which the algorithm has better worst-case behavior, but that
> >is either untrue or very hard to prove. Here are a few examples in which
> >algorithms with poor worst-case behavior are used on large problems
> >successfully because of the nature of the data:
> >
> >- graph and map layout
> >- automata determinization
> >- model-checking of (finite-state) specifications
> >- traveling salesman
> >- set cover
> >- boolean satisfiability using randomized local search
> [...]
> I fully agree with the main point. However, I am surprised to see
> traveling salesman in this list. Isn't it the case that in any 
> serious application approximation algorithms are used (which typically
> do not guarantee best solution) rather than, say, an A-star kind of
> algorithm? In my (limited) experience traveling salesman is one of
> those problems in which almost all instances _are_ extremely hard
> to solve within reasonable time bounds. 
I was writing in broad terms, so I conflated optimization problems with the
others on the list. The point is that hard problems may have practical
solutions, either because of peculiarities in the distribution of problem
instances of interest or because approximate solutions are good enough. But
you are right that, technically, these are two quite different ways in which
worst-case bounds are not reached. The general lesson here is that if you
care about a problem, intractability is a challenge to the ingenuity of the
practical algorithm designer rather than a recommendation to give up.
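To make the two escape routes concrete, here is a minimal sketch for the
traveling salesman case (in Python, since no code appears above; all names
are mine, not from any poster's code). Brute force hits the O(n!) worst case
and is only feasible on tiny instances, while a greedy nearest-neighbor
heuristic runs in O(n^2) with no optimality guarantee, yet on random
Euclidean instances its tour is typically close to optimal:

```python
import itertools
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    n = len(order)
    return sum(math.dist(points[order[i]], points[order[(i + 1) % n]])
               for i in range(n))

def exact_tsp(points):
    """Brute-force optimum: O(n!) -- the intractable worst case."""
    cities = range(1, len(points))
    best = min(itertools.permutations(cities),
               key=lambda perm: tour_length(points, (0,) + perm))
    return (0,) + best

def nearest_neighbor_tsp(points):
    """Greedy approximation: O(n^2), no optimality guarantee."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: math.dist(points[last], points[c]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tuple(tour)

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(8)]
opt = tour_length(pts, exact_tsp(pts))
approx = tour_length(pts, nearest_neighbor_tsp(pts))
print(approx / opt)  # ratio is always >= 1.0; often near 1 on random instances
```

The same pattern shows up in the other direction too: an exact solver run on
a benign instance distribution (rather than an approximation) is the other
way worst-case bounds go unexercised.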
