Differential Privacy in a Space of Functions
Feb 8, 2012

Differential Privacy is a popular criterion used to judge whether a randomized algorithm operating on a database of individuals may be deemed to preserve their privacy. This is a work in progress in which we demonstrate a Differentially Private algorithm whose output is an entire function; research has typically focused on the case where the output is a scalar or vector. Our approach is to add noise to the function, where the noise is a sample path of a Gaussian Process. Using tools from the theory of Reproducing Kernel Hilbert Spaces, we are able to determine the appropriate noise level for a wide class of functions.
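As a rough illustration of the idea (not the talk's actual mechanism or calibration), the sketch below evaluates a function on a grid and adds a Gaussian-process sample path drawn from a squared-exponential kernel. The kernel choice and the `sensitivity / epsilon` noise scale are placeholder assumptions; in the RKHS framework the correct scale depends on the function class and the privacy analysis.

```python
import numpy as np

def rbf_kernel(x, y, length_scale=0.1):
    """Squared-exponential covariance between grids x and y (assumed kernel)."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def private_release(f, grid, sensitivity, epsilon, rng):
    """Return f(grid) plus a scaled GP sample path.

    The scale sensitivity / epsilon is a placeholder; the proper level
    comes from the RKHS-based analysis in the talk.
    """
    cov = rbf_kernel(grid, grid)
    # Small jitter keeps the Cholesky factorization numerically stable.
    chol = np.linalg.cholesky(cov + 1e-10 * np.eye(len(grid)))
    path = chol @ rng.standard_normal(len(grid))  # one GP sample path
    return f(grid) + (sensitivity / epsilon) * path

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
noisy = private_release(np.sin, grid, sensitivity=0.01, epsilon=1.0, rng=rng)
```

The released object is the noisy curve `noisy`; evaluating it on a finer grid would simply mean sampling the GP path at more points.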

This talk will be self-contained in that it will not assume prior knowledge of differential privacy, stochastic processes, etc.