Beyond Value Selection Heuristics
In the first part of this talk, we study classic integer programming value selection heuristics for solving knapsack problems with a branch-and-bound approach. An empirical study reveals a characteristic development of the probability that a value heuristic assigns the "correct" value as a function of search depth. Interestingly, the same function is observed for heuristics that do not depend on the linear relaxation. We therefore conjecture that the improved accuracy of value selection heuristics is caused by a self-reinforcing effect: the heuristics favorably affect the distribution of subproblems in which the heuristic choice matters.
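To make the setting concrete, here is a minimal sketch of branch-and-bound for the 0/1 knapsack problem in which a value-selection heuristic decides which branch (take the item or skip it) to explore first. The function names and the specific heuristic (follow the LP relaxation's greedy fill) are illustrative assumptions, not the implementation studied in the talk.

```python
def lp_bound(values, weights, capacity, i, cur_value):
    """Upper bound from the LP relaxation: greedily fill remaining capacity,
    taking at most one fractional item (items are pre-sorted by value density)."""
    bound, cap = cur_value, capacity
    for j in range(i, len(values)):
        if weights[j] <= cap:
            cap -= weights[j]
            bound += values[j]
        else:
            bound += values[j] * cap / weights[j]  # fractional last item
            break
    return bound

def solve(values, weights, capacity):
    # Sort items by value density so the greedy LP bound is valid.
    order = sorted(range(len(values)), key=lambda j: -values[j] / weights[j])
    values = [values[j] for j in order]
    weights = [weights[j] for j in order]
    best = 0

    def branch(i, cap, cur):
        nonlocal best
        best = max(best, cur)
        if i == len(values) or lp_bound(values, weights, cap, i, cur) <= best:
            return  # all items assigned, or node pruned by the LP bound
        # Value-selection heuristic: follow the LP relaxation and try
        # x_i = 1 first whenever the item still fits.
        if weights[i] <= cap:
            branch(i + 1, cap - weights[i], cur + values[i])  # x_i = 1
        branch(i + 1, cap, cur)                               # x_i = 0
        # A "correct" heuristic choice here is one whose first branch
        # already contains an optimal completion of the subproblem.

    branch(0, capacity, 0)
    return best
```

For example, `solve([60, 100, 120], [10, 20, 30], 50)` returns the optimal value 220.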
In the second part of the talk, we devise a hybrid value-selection algorithm to boost the performance of complete search on underconstrained problems. We suggest using random variable selection in combination with restarts, augmented by a coarse-grained local search algorithm that "learns" favorable value heuristics over the course of several restarts. Numerical results show that this method can speed up complete search by orders of magnitude.
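The scheme above can be sketched as follows: each restart uses a fresh random variable order with a bounded failure budget, and a per-variable value-preference table (the "learned" value heuristic) is perturbed between restarts. The problem interface, budget mechanism, and update rule are simplifying assumptions for illustration, not the authors' exact method.

```python
import random

def restart_search(variables, domains, consistent,
                   max_restarts=50, fail_limit=100, seed=0):
    rng = random.Random(seed)
    # Learned value heuristic: one preferred value per variable.
    pref = {v: rng.choice(domains[v]) for v in variables}

    def backtrack(order, assignment, budget):
        if not order:
            return assignment          # all variables assigned
        if budget[0] <= 0:
            return None                # failure budget exhausted -> restart
        var, rest = order[0], order[1:]
        # Try the learned preferred value first, then the rest of the domain.
        vals = [pref[var]] + [d for d in domains[var] if d != pref[var]]
        for val in vals:
            assignment[var] = val
            if consistent(assignment):
                result = backtrack(rest, assignment, budget)
                if result is not None:
                    return result
            budget[0] -= 1             # count this value choice as a failure
        del assignment[var]
        return None

    for _ in range(max_restarts):
        order = variables[:]
        rng.shuffle(order)             # random variable selection per restart
        sol = backtrack(order, {}, [fail_limit])
        if sol is not None:
            return sol
        # Coarse local-search move on the heuristic itself: perturb one
        # preference so the next restart descends into a different region.
        var = rng.choice(variables)
        pref[var] = rng.choice(domains[var])
    return None
```

As a toy usage example, an all-different constraint over three variables: `restart_search(['a', 'b', 'c'], {v: [0, 1, 2] for v in 'abc'}, lambda a: len(set(a.values())) == len(a))` yields a permutation of {0, 1, 2}.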
Joint work with Carlos Ansotegui, Warren Schudy, and Daniel Leventhal.
Meinolf Sellmann has served as Assistant Professor at Brown University since 2004. He received his Diploma in Computer Science from Paderborn University (Germany) in 1997. After a year at Lucent Technologies Bell Labs (Murray Hill, NJ), he returned to his alma mater in Paderborn in 1998 and received his PhD there in 2002. Before coming to Brown, he spent sixteen months as a Postdoctoral Associate at Cornell University.
An important aim of Professor Sellmann's work is to provide actual software systems that can tackle real-world applications efficiently. Equally central is the abstraction and generalization of originally problem-tailored approaches into standard solution methods that facilitate algorithm design and algorithm engineering for constraint satisfaction and constrained optimization. The main methodological contributions of his research are the development of Symmetry Breaking by Dominance Detection, Structural Symmetry Breaking, Streamlined Constraint Reasoning, CP-based Column Generation, and CP-based Lagrangian Relaxation, as well as the introduction of associated theoretical notions such as Relaxed Consistency and Approximated Consistency.