Theoretical Connections Between Convex Optimization and Active Learning
March 27, 2013
Stochastic convex optimization (SCO) under the first-order oracle model deals with approximately optimizing a convex function over a convex set, given access to noisy function and gradient values at any queried point, using as few queries as possible. Active learning of one-dimensional threshold (ALT) classifiers is a classification problem that deals with approximately locating a "threshold" on a subinterval of the real line (a point to the left of which labels are more likely to be negative, and to the right of which labels are more likely to be positive), given access to an oracle that returns noisy labels at queried points, using as few queries as possible.
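As a rough formalization of the two query models (the notation below is illustrative and not necessarily that used in the talk or paper):

$$\text{SCO oracle:}\quad x \;\mapsto\; \big(f(x) + \xi,\; \nabla f(x) + \zeta\big), \qquad \mathbb{E}[\xi] = \mathbb{E}[\zeta] = 0,$$
$$\text{ALT oracle:}\quad x \;\mapsto\; Y \in \{-1,+1\}, \qquad \Pr(Y = +1 \mid x) = \eta(x), \quad \eta(x) \lessgtr \tfrac{1}{2} \text{ for } x \lessgtr \tau.$$

In both cases the goal is to achieve small error (optimization error for SCO, distance to the unknown threshold \(\tau\) for ALT) using as few oracle queries as possible.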

Exploiting the sequential nature of both problems, we establish a concrete similarity between the "Tsybakov Noise Condition" from ALT theory and the "Strong/Uniform Convexity Condition" from SCO theory, and show how information-theoretic lower-bound techniques from ALT can be used to derive very similar lower-bound rates in SCO (which we show are tight, with matching upper bounds). Time permitting, I will also present an algorithmic reduction from SCO to ALT for strongly/uniformly convex functions, as well as a new adaptive ALT algorithm that was inspired by a recent adaptive SCO algorithm.
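For concreteness, one standard way of writing the two conditions (the exponents and constants below are illustrative; the talk and paper may use different normalizations):

$$\text{Tsybakov noise condition (exponent } \kappa \ge 1\text{):}\qquad \big|\eta(x) - \tfrac{1}{2}\big| \;\ge\; \mu\, |x - \tau|^{\kappa - 1} \quad \text{for all } x,$$
$$\text{Uniform convexity (degree } \kappa \ge 2\text{):}\qquad f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x\rangle + \tfrac{\lambda}{2}\, \|y - x\|^{\kappa} \quad \text{for all } x, y,$$

with strong convexity corresponding to the case \(\kappa = 2\). Roughly speaking, the exponent governing how quickly the label noise recedes away from the threshold plays a role analogous to the exponent governing how much curvature f has near its minimizer.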

Part of this work will appear in ICML 2013 (http://jmlr.csail.mit.edu/proceedings/papers/v28/ramdas13.pdf), and the rest is work in progress.