# Mixed logical/linear programming

There are three main topics in this talk: the first is rethinking the role integer variables play in combinatorial optimization, the second is the links between optimization and constraint programming, and the third is expressing those links in a declarative modeling language.

It is possible to formulate some problems without integer variables, e.g., the TSP and the QAP. With integer variables the formulation is messy; with constraint programming it is a single constraint, the all-different constraint.
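As a minimal sketch of the point, the TSP can be stated with one all-different constraint: let y_k be the city visited in position k; feasibility is just that all the y_k differ. The distance data below is illustrative, and the search is brute force rather than any solver the talk describes.

```python
from itertools import permutations

# Illustrative distance matrix (not from the talk); dist[i][j] is the
# travel cost from city i to city j.
dist = [
    [0, 2, 9, 10],
    [1, 0, 6, 4],
    [15, 7, 0, 8],
    [6, 3, 12, 0],
]
n = len(dist)

def tour_cost(y):
    # Cost of visiting cities in order y[0], y[1], ..., returning to y[0].
    return sum(dist[y[k]][y[(k + 1) % n]] for k in range(n))

def alldifferent(y):
    # The single constraint of the CP formulation.
    return len(set(y)) == len(y)

# Brute force over assignments satisfying the all-different constraint.
best = min((y for y in permutations(range(n)) if alldifferent(y)),
           key=tour_cost)
print(best, tour_cost(best))
```

No integer 0-1 assignment variables or subtour-elimination constraints appear; the discrete structure is carried entirely by the all-different condition.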

Q: But in this framework the different complexities of the TSP and the QAP are not revealed.

A: Yes, working on it.

Other examples include piecewise linear functions, 1-machine scheduling, fixed charge network flow and resource-constrained job shop scheduling.

Integer variables have three traditional functions: Discrete modeling, branching (branch on fractions) and relaxation (continuous relaxation, Lagrangean dual, cutting planes). These functions should be distinguished. Instead we should use:

• Modeling - use logic-based and discrete modeling. Integer variables can still be the modeling device of choice, e.g., for the 0-1 knapsack.
• Branching - branch on discrete variables (values need not be integers).
• Relaxation - very important:
  • Use no relaxation.
  • Use projection of the traditional relaxation onto the continuous variables.
  • Use optimal separating inequalities.
  • Use relaxation duals, etc.
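The branching point above can be made concrete: once branching is on discrete variables rather than on fractional values, the domain values need not be numbers at all. A minimal sketch, with illustrative symbolic domains:

```python
# Branching on discrete variables whose values are not integers -- here a
# machine choice and a speed setting. Branching simply splits the domain;
# no integrality or fractional values are involved.
def branch(domains, assignment=()):
    # Enumerate the leaves of the branching tree over symbolic domains.
    if not domains:
        yield assignment
        return
    first, rest = domains[0], domains[1:]
    for value in first:          # one branch per value in the domain
        yield from branch(rest, assignment + (value,))

domains = [("A", "B"), ("fast", "slow")]
leaves = list(branch(domains))
print(leaves)
```

A real search would interleave this enumeration with pruning; here the tree is expanded in full only to show the branching rule.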

The general form is

min cx s.t. h_i(y) -> A^i(x) >= b^i

for all i, where y is discrete and x is continuous. Ranges can be written in this form but they can also be handled directly in the solver.

How do we solve such a problem? We branch on the discrete variables. At each node of the branching tree we solve an LP that includes all the systems A^i(x) >= b^i for which h_i(y) is true. We use logical inference and constraint satisfaction techniques to determine whether a partial assignment to y_1, ..., y_n makes h_i(y) true. In addition, the inferences drawn can prune the search tree. Computational experience has been positive.
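A minimal sketch of the node processing just described: branch over the discrete y, and at each leaf solve the LP assembled from the systems whose antecedent h_i(y) holds. To stay self-contained, the "LP" here is one-dimensional (min c*x subject to x >= b for each enforced bound, solved in closed form); a real implementation would call an LP solver instead. All data is illustrative.

```python
# Each conditional pairs a condition h(y) with a lower bound b on x,
# standing in for a linear system A^i x >= b^i enforced when h(y) holds.
def solve_node(y, conditionals, c=1.0):
    # Gather the enforced bounds, then solve min c*x s.t. x >= b_i, x >= 0.
    bounds = [b for h, b in conditionals if h(y)]
    if not bounds:
        return 0.0               # nothing enforced: optimum at x = 0
    return c * max(bounds)

conditionals = [
    (lambda y: y == "fast", 3.0),   # h(y): mode is fast  ->  x >= 3
    (lambda y: y == "slow", 1.0),   # h(y): mode is slow  ->  x >= 1
]

# Branch over the discrete domain and keep the best node value.
best = min(solve_node(y, conditionals) for y in ("fast", "slow"))
print(best)
```

The structure matches the talk's scheme: the discrete search decides which h_i(y) hold, and only the continuous part reaches the LP.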

Future research in this area includes good relaxations for all-different and other discrete constraints, as well as relaxation duals (generalizations of Lagrangean and surrogate duality).

Recall the original problem. One prunes the search tree by using logical inference to reduce the domain of the discrete variables.

The essence of constraint satisfaction is to draw inferences that reduce the domains. In the process some variables may be fixed. Resolution is a special case. Recall again the original problem. Now suppose that some of the y's have been fixed by branching. Is h_i(y) true? This is an inference problem: what can we infer from the fixed y's?

A constraint set is strongly k-consistent if, for t = 1, ..., k, whenever values are assigned to y_1, ..., y_(t-1) without violating any constraints, some value can be assigned to y_t without violating any constraints. A constraint set is consistent if it is strongly n-consistent. If the problem is consistent, a tree search will find a feasible solution without backtracking.
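The definition can be checked directly on a small example. A minimal sketch, assuming a tiny CSP with three binary variables and a chain of disequality constraints (illustrative data, not from the talk):

```python
from itertools import product

# Variables y1..y3 over {0, 1}; constraints are predicates on pairs of
# variable indices (0-based).
domain = [0, 1]
constraints = {
    (0, 1): lambda a, b: a != b,   # y1 != y2
    (1, 2): lambda a, b: a != b,   # y2 != y3
}

def consistent(partial):
    # True if the partial assignment violates no constraint whose
    # variables are all assigned.
    return all(c(partial[i], partial[j])
               for (i, j), c in constraints.items()
               if i < len(partial) and j < len(partial))

def strongly_k_consistent(k):
    # For t = 1..k: every consistent assignment to y_1..y_{t-1} extends
    # to some value of y_t without violating any constraints.
    for t in range(1, k + 1):
        for partial in product(domain, repeat=t - 1):
            if consistent(partial):
                if not any(consistent(partial + (v,)) for v in domain):
                    return False
    return True

print(strongly_k_consistent(3))
```

Here the chain of disequalities is strongly 3-consistent (n = 3), so by the claim above a depth-first search in the order y1, y2, y3 never backtracks.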

Consistency can be achieved by deriving all possible resolvents, i.e., implications obtained by resolution. k-consistency can be obtained by deriving all resolvents with fewer than k terms.
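A minimal sketch of the resolution step itself, for propositional clauses encoded as sets of integer literals (p positive, -p negated; the encoding is an assumption, not from the talk):

```python
# A clause is a frozenset of literals; resolving two clauses on a
# complementary pair p / -p removes the pair and unions the rest.
def resolvents(c1, c2):
    out = []
    for lit in c1:
        if -lit in c2:
            r = (c1 - {lit}) | (c2 - {-lit})
            # Discard tautologies (resolvents containing both p and -p).
            if not any(-l in r for l in r):
                out.append(r)
    return out

# (y1 or y2) and (not y2 or y3) resolve on y2 to give (y1 or y3).
c1 = frozenset({1, 2})
c2 = frozenset({-2, 3})
print(resolvents(c1, c2))
```

Closing a clause set under this operation, keeping only resolvents with fewer than k literals, is the consistency-building procedure the paragraph above refers to.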

A theorem of Freuder provides a link between problem structure, constraint generation and branching order.

How should we structure a modeling language to reflect solution options? The elements of a solution method are: enumeration of partial solutions (branching), checking whether a partial solution makes h_i(y) true (constraint propagation), and solving the resulting LP (LP solver).

Other realizations of this framework: enumerate intervals for the continuous variables, check constraint satisfaction with interval arithmetic, and solve the constraints in enforced consequents with an NLP solver.
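The interval-arithmetic check can be sketched in a few lines. Intervals are (lo, hi) pairs; a constraint expr >= rhs is certainly infeasible over a box when even the interval upper bound of expr falls below rhs, so such boxes can be pruned. The expression and bounds below are illustrative.

```python
# Interval addition and multiplication on (lo, hi) pairs.
def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

def may_satisfy_ge(expr_interval, rhs):
    # False only when expr >= rhs is violated everywhere on the box,
    # i.e., the interval's upper bound is already below rhs.
    return expr_interval[1] >= rhs

x = (0.0, 2.0)               # enumerated interval for x
y = (1.0, 3.0)               # enumerated interval for y
expr = iadd(imul(x, x), y)   # enclosure of x*x + y over the box
print(expr, may_satisfy_ge(expr, 10.0))
```

Since x*x + y is at most 7 on this box, the constraint x*x + y >= 10 cannot hold there and the box is pruned without calling the NLP solver.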