Learnability of DNF with Representation-Specific Queries
April 11, 2012
ABSTRACT:

We study the problem of PAC learning the class of DNF functions with a type of query specific to the representation of the target DNF. Specifically, given a pair of positive examples from a polynomial-sized sample, our query asks whether the two examples satisfy a term in common in the target DNF, or alternatively, how many terms they satisfy in common. We show that a number of interesting special types of DNF targets are efficiently properly learnable with this type of query, though the general problem of learning an arbitrary DNF target under an arbitrary distribution is no easier than in the traditional PAC model.
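To make the query model concrete, here is a minimal illustrative sketch (not taken from the paper): it assumes a hypothetical representation of a DNF target as a list of terms, each mapping variable indices to required values, and implements both the boolean-valued and numerical-valued versions of the query for a pair of positive examples.

```python
# Hypothetical sketch of the representation-specific query oracle.
from typing import Dict, List, Tuple

Term = Dict[int, bool]       # e.g. {0: True, 3: False} means x0 AND NOT x3
Example = Tuple[bool, ...]   # a full assignment to the n variables

def satisfies(term: Term, x: Example) -> bool:
    """True iff example x satisfies every literal of the term."""
    return all(x[i] == v for i, v in term.items())

def common_term_query(target: List[Term], x1: Example, x2: Example) -> bool:
    """Boolean-valued query: do the two positive examples satisfy a term in common?"""
    return any(satisfies(t, x1) and satisfies(t, x2) for t in target)

def common_term_count(target: List[Term], x1: Example, x2: Example) -> int:
    """Numerical-valued query: how many terms of the target do both examples satisfy?"""
    return sum(satisfies(t, x1) and satisfies(t, x2) for t in target)

# Example: target = (x0 AND x1) OR (NOT x2)
target = [{0: True, 1: True}, {2: False}]
x1 = (True, True, False)   # satisfies both terms
x2 = (True, True, True)    # satisfies only the first term
assert common_term_query(target, x1, x2)
assert common_term_count(target, x1, x2) == 1
```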

Specifically, we show that under the uniform distribution, using the numerical-valued queries, we can learn arbitrary DNF formulas; in the process, we give an algorithm for learning a monotone sum of terms. Furthermore, using the boolean-valued queries, we can properly learn any DNF with O(\log(n)) relevant variables, any DNF in which each variable appears in at most O(\log(n)) terms, and any DNF having at most 2^{O(\sqrt{\log(n)})} terms, all under the uniform distribution. Under general distributions, we find that 2-term DNF and disjoint DNF are both efficiently properly learnable using boolean-valued queries. With the numerical-valued queries, we can properly learn, under arbitrary distributions, any DNF with O(\log(n)) relevant variables, any DNF having O(\log(n)) terms, and any DNF for which each example can satisfy at most O(1) terms.

Other possible generalizations of the query include allowing the algorithm to ask the query about an arbitrary number of examples from the sample at once (rather than just a pair), or allowing the algorithm to ask the query about examples of its own construction; we show that both of these generalizations allow for efficient proper learning of arbitrary DNF functions under arbitrary distributions.
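As a rough illustration of the first generalization, the pairwise query above extends naturally to an arbitrary set of positive examples. The representation and names below are again hypothetical, carried over from the earlier sketch for concreteness rather than taken from the paper.

```python
# Hypothetical sketch of the generalized query: given an arbitrary set of
# positive examples (not just a pair), report whether some single term of the
# target DNF is satisfied by all of them.
from typing import Dict, List, Tuple

Term = Dict[int, bool]       # maps a variable index to its required value
Example = Tuple[bool, ...]   # a full assignment to the n variables

def satisfies(term: Term, x: Example) -> bool:
    return all(x[i] == v for i, v in term.items())

def common_term_query_set(target: List[Term], xs: List[Example]) -> bool:
    """Do all examples in xs satisfy at least one term of the target in common?"""
    return any(all(satisfies(t, x) for x in xs) for t in target)

# Example: target = (x0 AND x1) OR (NOT x2); all three examples share the first term.
target = [{0: True, 1: True}, {2: False}]
xs = [(True, True, False), (True, True, True), (True, True, False)]
assert common_term_query_set(target, xs)
```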

Joint work with Avrim Blum and Jaime Carbonell. FOCS 2012 submission.