Algorithm configuration is an important aspect of modern data science and algorithm design. Algorithms regularly depend on tunable parameters that have a substantial effect on runtime and solution quality. Researchers have developed machine learning techniques for automated parameter tuning that use a training set of problem instances to determine a configuration with high expected performance over future instances. This line of work has inspired breakthroughs in diverse fields including combinatorial auctions, scientific computing, vehicle routing, and SAT. The resulting configuration depends on the training set, which implies that, without proper precautions, it may leak sensitive information about the problem instances contained therein.

In this work, we address this problem by showing that a natural adaptation of the exponential mechanism provides strong privacy and utility guarantees for a wide range of algorithm configuration problems. We uncover a structural property shared among many problems which ensures this algorithm's success: these problems reduce to maximizing piecewise L-Lipschitz functions. We call upon techniques from high-dimensional geometry to efficiently implement this algorithm when the algorithm parameter space is multi-dimensional. We apply our differentially private algorithm to many configuration problems where privacy preservation is essential, such as integer quadratic programming and greedy algorithm configuration. We also show that our differentially private algorithm can be used to design auctions and pricing mechanisms based on consumer data, thus further exhibiting the algorithm's broad applicability.
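To give a sense of the core technique, here is a minimal sketch of the exponential mechanism applied to a discretized one-dimensional parameter space. This is purely illustrative: the function names, grid, and toy utility are assumptions, and the actual contribution concerns efficient sampling over continuous, multi-dimensional parameter spaces where the utility is only piecewise Lipschitz.

```python
import numpy as np

rng = np.random.default_rng(0)

def exponential_mechanism(utility, grid, epsilon, sensitivity):
    """Sample a parameter from `grid` with probability proportional to
    exp(epsilon * utility / (2 * sensitivity)), the standard
    exponential-mechanism distribution."""
    scores = np.array([utility(p) for p in grid])
    # Subtract the max score for numerical stability; this shift
    # does not change the sampling distribution.
    logits = epsilon * (scores - scores.max()) / (2.0 * sensitivity)
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(grid, p=probs)

# Toy stand-in for a configuration objective: average algorithm
# performance as a function of one parameter, with a jump
# discontinuity (piecewise Lipschitz, not globally Lipschitz).
def utility(p):
    return p if p < 0.6 else p - 0.5

grid = np.linspace(0.0, 1.0, 101)
chosen = exponential_mechanism(utility, grid, epsilon=2.0, sensitivity=1.0)
```

With high probability the sampled parameter lands near the utility-maximizing piece, while the randomization guarantees epsilon-differential privacy with respect to the training instances that define the utility.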

This is joint work with Nina Balcan and Travis Dick.