From mjs@hubcap.clemson.edu Thu Jan 20 12:03:52 1994
Received: from hubcap.clemson.edu by orion.uwaterloo.ca with SMTP
id ; Thu, 20 Jan 94 12:03:30 -0500
Received: from localhost (mjs@localhost) by hubcap.clemson.edu (8.6.4/8.6.4) id MAA16250; Thu, 20 Jan 1994 12:04:00 -0500
Date: Thu, 20 Jan 1994 12:04:00 -0500
From: "M. J. Saltzman"
Message-Id: <199401201704.MAA16250@hubcap.clemson.edu>
To: hwolkowi@orion.uwaterloo.ca
Subject: Re: Square root algorithm patent
Newsgroups: sci.math.num-analysis
References: <2gtdci$hfp@ground.cs.columbia.edu> <1994Jan20.022556.22404@njitgw.njit.edu>
Organization: Clemson University
In article you write:
>In article <1994Jan20.022556.22404@njitgw.njit.edu>,
> wrote:
>>Can algorithms be patented? As far as I know, computer programs can,
>>but algorithms can't. Can anyone find a pointer to appropriate
>>passages in the Copyright law?
>>--
>>Andrzej Pajak | "Heimat ist die Menschen die wir verstehen,
>>Department of Applied Mathematics | und die uns verstehen" - Max Frisch.
>>New Jersey Institute of Technology | "All good people read good books" -
>>Newark, NJ (201) 491-5133 | Tanita Tikaram.
>
>I think that there was a case recently where an algorithm was patented -
>though I do not know the exact details. This had to do with the interior
>point methods which gained popularity following the work by Karmarkar.
>The court case involved AT&T where Karmarkar works. As far as I know,
>there is a patent on this work and the case was settled out of court -
>money passed to AT&T because of the patent.
>
>--
>---------------------------------------------------
>Professor Henry Wolkowicz |Fax: (519) 746-6530
>University of Waterloo |Tel: (519) 888-4567 ext. 15589
>Faculty of Mathematics |email: henry@orion.uwaterloo.ca
FYI, Here's a summary that I wrote up in response to an earlier thread on
this topic, and some other discussion of the matter.
Matthew Saltzman
Clemson University Math Sciences
mjs@clemson.edu
>From: mjs@hubcap.clemson.edu (M. J. Saltzman)
>Subject: Re: The Karmarkar Algorithm (long)
>Date: 24 Mar 91 20:39:35 GMT
>
>A brief history of interior point algorithms for linear programming
>follows. I don't have all the details, and I would welcome corrections,
>additions, etc.
>
>ca. 1947:
>--------
>George Dantzig develops the well-known simplex algorithm for linear
>programming. The first "large-scale" problem is a 9x77 instance of
>the "diet problem" (find a min-cost diet that meets or exceeds
>nutritional requirements), solved by 9 clerks using hand-operated
>desk calculators, in about 120 man-days.
>
>In ensuing years, many refinements to the basic simplex algorithm
>are proposed, implemented, tested, and included or not in commercial
>LP codes. Improvements to the algorithm and advances in computer
>technology bring the size of a "large-scale" general LP up to
>around a few thousand rows and several thousand columns.
>
>1980:
>----
>Khachian publishes the ellipsoid algorithm, which proves that LP
>can be solved in time proportional to a polynomial in the length of
>the input. Implementations, however, prove not to be competitive with
>commercial simplex implementations. Part of the reason for this is
>the high degree of the polynomial, and the fact that the average case
>behavior is as bad as the worst case, and worse than the average
>simplex behavior. Part is also due to the state of the art of sparse
>linear algebra (see below).
>
>1984-present:
>------------
>Karmarkar announces his polynomial-time algorithm for LP and makes
>extravagant claims for its performance compared to the simplex method.
>He refuses to release certain implementation details, as well as the
>data for problem instances he reports on. Much gnashing of teeth
>ensues in the optimization community. Several independent attempts to
>implement the method show that the number of iterations is indeed as
>small as predicted, but each iteration is expensive. A set of LP test
>problems is made available on Netlib as a standard of comparison.
>AT&T announces KORBX (an acronym for nothing; it infringes on no
>trademarks), an 8-processor Alliant computer and software package,
>priced at $8.9 *million*. AT&T patents certain features of KORBX,
>raising again the issue of whether software is patentable. (As of
>this moment, the patents stand.)
>
>Two papers by Adler, Karmarkar, et al., describe an implementation of
>the algorithm, revealing the key to reducing the cost of an iteration.
>This has to do with advances in the last decade in the state of the
>art of sparse linear algebra. Developed mainly in the context of
>finite element and other engineering problems, these techniques apply
>graph theory to the analysis of sparse, symmetric, positive definite
>matrices to control the density of the Cholesky factors. (Note: these
>techniques appear to be applicable to the ellipsoid algorithms as
>well, but Karmarkar's algorithm has a lower-order polynomial
>complexity.)
>
>Gill, Murray, Saunders, Tomlin and Wright show that Karmarkar's
>algorithm is equivalent to a projected Newton barrier algorithm. This
>is the key to the current class of implementations. Current
>successful implementations (such as Lustig, Marsten and Shanno's OB1)
>seem to be settling in on a version of the algorithm known as the
>"primal-dual barrier method," including some significant algorithmic
>enhancements over the basic method. Commercial implementations
>include OB1, IBM's OSL, and one from AT&T (I'm not sure of the status
>of KORBX these days, but it was never a commercial success).
>
>Research continues into many areas of theory and practice in interior
>point algorithms. It has proved to be a very exciting and productive
>area of research.
>
>One side effect of this research has been to revive interest in
>updating the simplex method to take advantage of new computer
>technology and new insights into problem structure. Dramatic advances
>have been made in this area as well, in commercial and research codes
>such as OSL, Bixby's CPLEX, and Saunders's MINOS. The best
>implementations of both methods trade off fastest times on different
>classes and sizes of problems, leapfrogging each other on a regular
>basis. "Large-scale" problems are now in the 10's or 100's of
>thousands of constraints and variables, and are solved routinely on
>workstations and supercomputers.
>
>References:
>----------
>
>  ADLER, KARMARKAR, RESENDE, and VEIGA (1989) "An Implementation of
>  Karmarkar's Algorithm for Linear Programming," _Math._Prog._
>  vol. 44, pp. 297-335.
>
>  ADLER, KARMARKAR, RESENDE and VEIGA (1989) "Data Structures
>  and Programming Techniques for the Implementation of
>  Karmarkar's Algorithm," _ORSA_J._Computing_1:4, pp. 84-106.
>
>  GEORGE and LIU (1981), Computer Solution of Large Sparse
>  Positive Definite Systems, Prentice Hall.
>
>  GILL, MURRAY, SAUNDERS, TOMLIN and WRIGHT (1986) "On Projected
>  Newton Barrier Methods for Linear Programming, and an
>  Equivalence to Karmarkar's Projective Method,"
>  _Math._Prog._ vol. 36, p. 183.
>
>  LUSTIG, MARSTEN and SHANNO (1989) "Computational Experience
>  with a Primal-Dual Interior Point Method for Linear
>  Programming," Ga. Tech. Technical Report J-89-11.
>
>  LUSTIG, MARSTEN and SHANNO (1990) "On Implementing Mehrotra's
>  Predictor-Corrector Interior Point Method for Linear
>  Programming," Princeton Technical Report SOR 90-03.
>
>  MARSTEN, SUBRAMANIAN, SALTZMAN, LUSTIG and SHANNO (1990),
>  "Interior Point Methods for Linear Programming: Just Call
>  Newton, Lagrange, and Fiacco and McCormick!"
>  _Interfaces_20:4, pp. 105-116.
>
>and other citations in these papers.
>
>  Matthew Saltzman
>  mjs@clemson.edu
>
>
Newsgroups: comp.patents
Subject: Followup on Karmarkar's algorithm and the AT&T patent
Summary: Response and clarification on earlier posts
Organization: Clemson University
This thread has died down some, but I finally have a chance to make
some notes on what was posted before, which I hope readers will find
interesting. It could get kind of long, as I will be quoting from a
number of earlier posts. (I'm pretty sure I have the attributions
right, but I welcome corrections.)
Matthew Saltzman
Clemson University Math Sciences
mjs@clemson.edu
It all started when:
In message <3495@cluster.cs.su.oz.au>, patents@cs.su.oz.au (Peter Treloar) wrote:
>
>In article <3494@cluster.cs.su.oz.au> Dan Bernstein wrote:
>
>>Stop hedging. Give an example. I suspect that a number of readers would
>>like you to stop beating around the bush. Name one software patent which
>>has shown a benefit to society. Well?
>
>I would give the example of US4744028 which is (I think ?) Karmarkar's
>linear programming patent titled: "Methods and Apparatus For Efficient
>Resource Allocation".
>
>This patent shows an improved method for linear programming that is
>apparently far superior to previous methods...
>
>I quote from page 2:
> "The best known prior art approach to solving allocation problems
> posed as linear programming models is known as the simplex method,
> invented by George B. Danzig in 1947...."
^^^^^^
Dantzig. Is it really misspelled in the patent application?
>Hence this is the first substantial improvement for approximately 50
>years.
A case can be made for prior art, though. Karmarkar's method (as
described in his original paper in Combinatorica, 1984) contains some
innovative ideas that serve mostly to advance the proof that the
algorithm is polynomial. Gill, et al. (1986) showed that in fact,
Karmarkar's method was equivalent to a projected Newton barrier
algorithm. This is a well-known method for nonlinear programming: the
same basic idea is described by Fiacco and McCormick (1968), and a
section in that book describes the application of the method
specifically to linear problems. The technology to make the method
practical became available in the 1980's, with sparse matrix
techniques such as those described by George and Liu (1981). I've
even heard (but don't have a citation) that F.&McC.'s SUMT code
(Sequential Unconstrained Minimization Technique) has been shown to be
polynomial for the linear problem.
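The barrier idea described above is simple enough to sketch. The following is a toy illustration in modern numerical Python (which of course postdates this thread), not a reconstruction of SUMT, KORBX, or OB1; the function name, parameters, and the tiny example LP are mine.

```python
import numpy as np

def barrier_lp(c, A, b, x0, t=1.0, mu=10.0, gap_tol=1e-6):
    """Toy SUMT/log-barrier method for  min c.x  s.t.  A x <= b.

    x0 must be strictly feasible (A x0 < b).  Each outer pass solves
    the smooth centering problem  min  t*c.x - sum(log(b - A x))
    by damped Newton steps, then increases t; m/t bounds the gap.
    """
    x = np.asarray(x0, dtype=float)
    m = len(b)
    while m / t > gap_tol:
        for _ in range(100):                    # Newton centering
            s = b - A @ x                       # slacks (> 0)
            g = t * c + A.T @ (1.0 / s)         # gradient
            H = (A.T * (1.0 / s**2)) @ A        # Hessian = A' D A
            dx = np.linalg.solve(H, -g)
            lam2 = -g @ dx                      # Newton decrement^2
            if lam2 / 2 < 1e-10:
                break
            # backtrack: stay strictly feasible and decrease f
            f = lambda z: t * c @ z - np.log(b - A @ z).sum()
            step = 1.0
            while (np.min(b - A @ (x + step * dx)) <= 0
                   or f(x + step * dx) > f(x) - 0.25 * step * lam2):
                step *= 0.5
            x = x + step * dx
        t *= mu
    return x

# max 3u + 2v  s.t.  u + v <= 4, u + 3v <= 6, u, v >= 0
c = np.array([-3.0, -2.0])
A = np.array([[1.0, 1.0], [1.0, 3.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([4.0, 6.0, 0.0, 0.0])
x = barrier_lp(c, A, b, x0=np.array([1.0, 1.0]))
print(x)    # iterates approach the optimal vertex (4, 0) from the interior
```

Note the computational heart of each iteration: forming and solving a system with the matrix A'DA, where only the diagonal D changes from step to step. That structure is what the sparse-matrix technology mentioned above exploits.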
IMHO, this patent has not benefited society. If faster LP algorithms
are a benefit to society, then the benefit has occurred despite, not
because of the patent. Furthermore, the independent developers are
producing results that are far superior to the subject of the patents.
>[deleted]
>
>Hence the patent owner (guess who - AT&T :-)), if they seriously wanted
>to obtain the maximum financial benefit from their discovery (which IMHO can
>just as easily be called a mathematical theorem) could keep the method
>(essentially an algorithm) secret for the next, say, 200 years and
>allow its usage only to the highest bidder.... The user only being
>allowed to send them a tape and gets back the results on tape and that
>is all....
Well, maybe, but the fact is that all the knowledge necessary to
"rediscover" the method was in the public domain: Newton's method,
Lagrangian duality, SUMT, and the sparse matrix methods. Once people
started looking, it was probably inevitable that the method would be
developed independently.
In fact it is probably safe to say that, so far, the patent has been
of limited value to AT&T. The division formed to market their first
offering is dormant or disbanded. They sold two systems at the
original $8.9 million(!) price. I (and others) speculate that they have
taken a fairly substantial loss on the original product. The software
is now being offered (without the accompanying computer) for O($10K-$100K).
In article <3501@cluster.cs.su.oz.au>, tmb@ai.mit.edu (Thomas M. Breuel) wrote:
>In fact, the patent and initial publications by Karmarkar were
>apparently insufficient to take advantage of the purported "great
>superiority" of Karmarkar's method (see the article quoted below for
>details). Thus, society has granted AT&T a monopoly on a method
>without forcing AT&T to fully disclose their invention. That is
>clearly against the spirit of the patent system.
>
>[reproduces an article I wrote for sci.math sketching the history of
>the research and the controversy in the research community]
I don't have the text of the patents, but the insufficient disclosure
was an issue in the *research community* long before the patents were
disclosed. (BTW, the Mathematical Programming Society has published a
position paper opposed to software patents. There is some debate
continuing in the newsletters of various OR professional societies on
this.)
Subsequent papers by Adler, Karmarkar, et al. (1989) did lay out the
key points of the implementation.
>Software patents would be much more useful if the patent holder was
>required to submit working source code in a standard programming
>language, and if the patent holder was required to demonstrate, using
>that source code, that his technique actually represents an
>improvement over some previous method.
I don't want to join the debate here, but this does seem reasonable on
its face.
In article <3524@cluster.cs.su.oz.au>, brnstnd@KRAMDEN.ACF.NYU.EDU (Dan Bernstein) wrote:
>In response to my challenge, Peter suggests that Karmarkar's algorithm
>(USP 4744028) has shown a benefit to society. In support of this, he
>says that the algorithm is a ``substantial improvement'' over the
>simplex method, and that the patent owner would have kept the algorithm
>secret for many years if a patent had not been available.
>
>Just for fun I'll demolish this argument piece by piece. First of all,
>Karmarkar's algorithm is an incredibly poor algorithm in practice. It is
>far more difficult to implement than the simplex method. It runs much
>more slowly except on very large, contrived examples. It's of some
>theoretical interest because it's guaranteed to run in polynomial time,
>and subsequent refinements have made it reasonably fast, but I don't
>know anyone who would willingly use it in place of simplex. About the
>only exceptions are problems where it's important to get an answer in a
>fixed amount of time.
Sorry to spoil the fun, but this view is based on very outdated
information. Implementations of the Newton barrier method are now
quite efficient on a number of different classes of problems. If new
developments in simplex implementations had not also occurred in the
last 5 years, it is quite likely that this would be the preferred
method for solving very large problems. It is true that Karmarkar's
original algorithm *as described in the original article* is not
practical, and it is also true that AT&T's original product was not
competitive (particularly at its price point 8^)).
>So there's no chance that the patent owner would have been able to make
>money off the method if a patent hadn't been available. (The point of
>getting a patent for such a poor method, in case anyone's wondering, is
>that the patent can still apply to later improvements which make the
>method truly useful. That hasn't happened for Karmarkar's algorithm---
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>yet.)
^^^^
This is simply false. On the other hand, improvements in the simplex
method have enabled that method to keep pace with the interior point
developments. Problems are solved routinely now with both methods
that would have been considered hopeless five years ago.
Considering the speed with which AT&T brought a product to market, and
their marketing strategy, I have a hard time believing that this was
their main motivation for filing.
>Would Karmarkar have kept his algorithm secret, as Peter suggests?
>Of course not. He would have published somewhere, and the news would
>have been of some interest to the computer science community. That's
>what people do with inventions in mathematics and computer science.
>That's what AT&T, IBM, and other big companies were doing for years
>before anyone had heard of software patents.
Interestingly, that did happen here (Karmarkar, 1984; Adler, et al.,
1989). That brings up the point of exactly what is covered by the
patents (see below).
>As is, the patent is still out there, and if someone does come up with a
>variation that's better than simplex, we'll all have to pay the patent
>owner for the privilege of using what would otherwise have been free.
>That's a benefit?
So far, the other offerings in this market are from IBM (who has a
cross-licensing agreement with AT&T, under which this patent is
covered) and XMP, Inc. (a couple of professors with a code). XMP
holds a license as a result of negotiations with AT&T, under which
they pay money to AT&T. I think it's fair to say that XMP would claim
that the method is unpatentable, but they couldn't afford to go up
against AT&T in a court case.
>Finally, is anyone willing to claim that Karmarkar's algorithm is not a
>procedure for solving a mathematical problem? I say it is. As such, it
>is not subject to a patent. There isn't any question here. The courts
>have defined ``mathematical algorithm'' as a procedure for solving a
>mathematical problem, and it's well established that mathematical
>algorithms are not subject to patents. Now if only patent examiners were
>competent to recognize mathematics, the software patent problem would go
>away.
As I understand it, the patent in this case is at least a little bit
subtle. If someone has the actual text, I would welcome a
clarification on this point. The patent actually covers aspects of
KORBX (the name of the original *product*, which included a
multiprocessor and the software to drive it). On the other hand, AT&T
would seem to want to claim that any other implementation is
infringing. I think the main claim that they push is the idea of a
"potential function" which serves as a surrogate measure of the
algorithm's progress. It was this development that allowed Karmarkar
to prove polynomiality. Its analog in Newton barrier methods is
arguable, though. I wonder if the Gill, et al., paper provides
ammunition for AT&T in this case (i.e., if you implement a Newton
barrier method, then by the mathematical equivalence, you must have
implemented a potential function, so you are infringing).
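For concreteness, the quantity at issue can be written down. In one standard form from the published literature (for a problem min c'x, Ax = b, x >= 0, with the optimal value normalized to zero; this is a sketch of the mathematics, not the patent's claim language):

```latex
f(x) \;=\; n \ln\!\left(c^{\mathsf T}x\right) \;-\; \sum_{j=1}^{n} \ln x_j
```

Each projective step reduces f(x) by at least a fixed constant, which is what yields the polynomial iteration bound. The logarithmic barrier function of a Newton barrier method contains a formally similar sum of logarithms, and whether that similarity amounts to "implementing a potential function" is precisely the arguable point raised above.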
In article <3525@cluster.cs.su.oz.au>, eggert@twinsun.com (Paul Eggert) wrote:
>A while ago I asked for specific examples where software patents have helped,
>given their demonstrated harm in specific examples like the patents for public
>key cryptosystems, XOR cursors, and backing store. Responses included:
>
> A. Peter Treloar suggested the linear programming patent developed at AT&T
> Bell Labs by Karmarkar. This was an advance over previous technique, but it
> is a weak example of software patent benefits, for three reasons.
>
> 1. The patent does not disclose crucial details of AT&T's method,
> thus removing the principal claimed benefit to society of software patents.
>
> 2. AT&T's method was based on an earlier idea developed in the Soviet
> Union by Khachian, which had recently attracted wide attention; if AT&T
> had not invented the patented algorithm, someone else was likely to
> anyway. (Needless to say, Khachian's idea was _not_ patented.)
Well, no. Khachian's idea (which was not practical) is quite
different, and is not at all relevant to the discussion. It is still
interesting to ask whether Karmarkar's work is derivative (WRT the
patent, that is--it is original mathematics for sure), in light of
Gill, et al.'s result.
> 3. AT&T would most likely have published this algorithm anyway even if
> software patents were not allowed. It did publish many other software
> ideas developed at Bell Labs before software patents were allowed. Since
> academic competition in the area was fierce, since the result was
> prestigious for AT&T, and since further necessary improvements were far
> more likely if it were published than if it were kept secret, there was
> strong pressure to publish.
Or not. AT&T seems to have wanted to get a lock on the market for LP
solvers, and they thought they had something far superior to what was
available. They may have used trade-secret protection instead.
In article , bennett@math.ksu.edu (Andy Bennett) wrote:
>My admittedly imperfect understanding is that Karmarkar's algorithm is not
>based on earlier ideas of Khachian. They both give polynomial upper bounds
>but they do so using quite different methods. Several commentators have
>claimed that the algorithm would have been quickly deduced by others. I see
>no reason to believe that.
The first point is correct, as I noted above. I hope I have clarified
the second point above. Practical implementations are based on prior
art. Karmarkar's exact method is more mathematically subtle, but
correspondingly less practical. I find it hard to imagine that
someone would not eventually have put together SUMT and sparse matrix
technology and discovered that it worked well on linear programs.
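To illustrate that last point: the computational kernel such a combination relies on is a sequence of solves with A D A' for changing diagonal D but fixed sparsity pattern. The sketch below uses modern SciPy purely as a stand-in (these libraries did not exist at the time of this thread), and the matrix is random rather than drawn from any real LP.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# Kernel of a barrier/interior-point iteration: solve the normal
# equations  (A D A') y = r,  where D > 0 is a diagonal scaling that
# changes every iteration while A's sparsity pattern does not -- so
# the symbolic work (a fill-reducing ordering, George & Liu style)
# can be amortized across iterations.
rng = np.random.default_rng(0)
B = sp.random(50, 200, density=0.05, random_state=0, format="csr")
A = sp.hstack([B, sp.identity(50, format="csr")], format="csr")  # full row rank

d = rng.uniform(0.5, 2.0, size=A.shape[1])   # iteration-dependent scaling
M = (A.multiply(d) @ A.T).tocsc()            # A D A', sparse and positive definite
r = rng.standard_normal(50)

lu = splu(M)                                 # sparse factorization (LU here,
y = lu.solve(r)                              # as a stand-in for sparse Cholesky)
print(np.max(np.abs(M @ y - r)))             # residual near machine precision
```

In a real code the factorization's ordering would be computed once and reused as D changes; controlling the fill-in of the factors is exactly the graph-theoretic technology the history above credits for making these methods practical.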
>> 3. AT&T would most likely have published this algorithm anyway even if
>> software patents were not allowed. It did publish many other software
>> ideas developed at Bell Labs before software patents were allowed. Since
>> academic competition in the area was fierce, since the result was
>> prestigious for AT&T, and since further necessary improvements were far
>> more likely if it were published than if it were kept secret, there was
>> strong pressure to publish.
>
>AT&T did indeed publish many ideas earlier. At that time AT&T still had a
>monopoly on telephone service and there was much less pressure on Bell Labs
>to show a profit. Karmarkar's algorithm was introduced shortly after the
>breakup of AT&T upset Bell Labs' applecart. From my conversations with people
>at Bell Labs, I have little doubt they would have kept the algorithm secret
>without patent protection.
>[...]
>I think asking for a specific beneficial patent is a loaded question.
>Obviously in any single case we are better off to have something for free
>rather than something we have to pay for. Politicians have used this in
>their campaign promises for years in the US (and I suspect elsewhere).
>The benefit of a particular patent is that we all get to know the information,
>and it is hard to prove that information in any given patent wouldn't have
>come out anyway. I certainly believe that to be the case with Karmarkar's
>algorithm, but since patents were available, we'll never know for sure what
>would have happened if AT&T couldn't have patented the algorithm.
In article <3569@cluster.cs.su.oz.au>, kim@Software.Mitel.COM (Kim Letkeman) wrote:
>In article <3506@cluster.cs.su.oz.au> plains!stephens@uunet.uu.net (Ben Stephenson) writes:
>
>| I disagree that software patents will protect the little guy. A
>| little guy I know was squashed by an XOr cursor quality patent
>| claim. A large company (large companies like IBM, Microsoft &
>| Borland own most of the patents that have been registered so far),
>| threatened by a small competitor need only to wave out some patent
>| they have and explain to the small company the cost of fighting
>| their questionable patent. Large, cash-rich companies can force
>| smaller companies into cross-licensing their technology for very
>| little by threat of court action over a questionable patent.
>
>Could you please provide some evidence of a little guy being squashed
>by a big guy over a patent. I am interested in seeing information on
>this actually occurring...
As I pointed out above, XMP holds a license for the Karmarkar patent
from AT&T. AT&T didn't even have a competitive product, yet they were
prepared to pressure this small company over patent infringement.
That they didn't demand more exorbitant terms may be testimony not to
the benevolence of AT&T, but to the tenuous nature of their case.
In article <3583@cluster.cs.su.oz.au>, zalesak@bme.unc.edu (Rudy Zalesak) wrote:
>A note I posted on sci.math brought me the information that
>Karmarkar's algorithm is very similar to algorithms already developed
>by Fiacco and McCormick in the 1960's, and that most work on "interior
>point" methods is now based on this work and not Karmarkar's work.
>
>A reference I was given is
>"Interior Point Methods for Linear Programming: Just Call Newton,
>Lagrange, and Fiacco and McCormick",
>Interfaces, July-August 1990, by Marsten, Subramanian, Saltzman, Lustig
>and Shanno. I looked it up this morning- it is very readable and
>interesting. "In 1984, Narendra Karmarkar began the 'new era of
>mathematical programming' with the publication of his landmark paper.
>Shortly thereafter his employer, AT&T, invited the professional
>mathematical programming community to roll over and die."
The authors appreciate your compliment. 8^)
>
> Rudy Zalesak
> zalesak@bme.unc.edu
>
>[pjt- so was it revolutionary or not ?? ]
The bottom line? Yes and no. The theoretical result was certainly
interesting. The claims made by Karmarkar and AT&T for their
implementation in 1984 were certainly inflated. On the other hand,
subsequent research in both interior and simplex methods has begun to
revolutionize the field, in the sense that previously-undreamt-of
models can be solved routinely now. Also, earlier methods for other
problems in integer and nonlinear programming are being rethought in
light of these developments. Fiacco & McCormick's book was recently
republished in SIAM's Classics in Applied Mathematics series. There's
no question as to the reason for this timely decision.
References:
----------
ADLER, KARMARKAR, RESENDE, and VEIGA (1989) "An Implementation of
Karmarkar's Algorithm for Linear Programming," _Math._Prog._
vol. 44, pp. 297-335.
ADLER, KARMARKAR, RESENDE and VEIGA (1989) "Data Structures
and Programming Techniques for the Implementation of
Karmarkar's Algorithm," _ORSA_J._Computing_1:4, pp. 84-106.
FIACCO and McCORMICK (1968), Nonlinear Programming: Sequential
Unconstrained Minimization Techniques, Wiley.
GEORGE and LIU (1981), Computer Solution of Large Sparse
Positive Definite Systems, Prentice Hall.
GILL, MURRAY, SAUNDERS, TOMLIN and WRIGHT (1986) "On Projected
Newton Barrier Methods for Linear Programming, and an
Equivalence to Karmarkar's Projective Method,"
_Math._Prog._ vol. 36, p. 183.
LUSTIG, MARSTEN and SHANNO (1989) "Computational Experience
with a Primal-Dual Interior Point Method for Linear
Programming," Ga. Tech. Technical Report J-89-11.
LUSTIG, MARSTEN and SHANNO (1990) "On Implementing Mehrotra's
Predictor-Corrector Interior Point Method for Linear
Programming," Princeton Technical Report SOR 90-03.
MARSTEN, SUBRAMANIAN, SALTZMAN, LUSTIG and SHANNO (1990),
"Interior Point Methods for Linear Programming: Just Call
Newton, Lagrange, and Fiacco and McCormick!"
_Interfaces_20:4, pp. 105-116.
--
Matthew Saltzman
Clemson University Math Sciences
mjs@clemson.edu